Thanks for this fantastic post I love it!
Even though this has been discussed a reasonable amount, I think there's a continuing need for discussion and iteration around topics like perks, salaries, and work culture.
I think on the work retreat front there are many options for remote workers that are far cheaper than the bougie hotel one you could explore. Perhaps hire a simple but nice Airbnb for the team near the beach, or find a simple Girl Guide/Boy Scout retreat centre that can host people really well at a fraction of the cost, often with nice outdoor and indoor activities nearby (table tennis, pool, ropes course, river, etc.).
My point of disagreement is on the salaries, which I think should also be lower, but that's somewhat aside from the article's main point. I also think (although I'm not sure) that I find myself in the minority on that front here on the forum, as I seem to take both a karma hammering (which I don't understand) and an agreement hammering (which I do understand) whenever I mention it ;). However, I LOVE your comment here about salary tradeoffs:
“I don’t have a strong general opinion about this. The only clearheaded point I can make is that overpaying and underpaying both send powerful signals; sometimes those signals are what you want to be sending and other times they’re really not. Carefully consider what you’re signalling with your pay structure, e.g., high salaries might say: ‘everyone is equal’ or ‘we hire only the best’ or ‘we expect you are compromising your values to work here’. And low salaries might say: ‘everyone is equal’ or ‘we’re all here because we care so much’ or ‘we don’t value your work’.”
I also think greed may have been an underrated corrupting force in recent Effective Altruism scandals, although it's difficult to tease it out from other forces like lust for power. I might have a go at writing about it soon.
I’m gonna push the general line on this and say “it’s also the job of individual orgs to negotiate their salaries with individual people”. I have time for a discussion of vibes but I think markets outcompete central planning.
Likewise, there is clearly another bad equilibrium where orgs spend roughly the same amount of money on worse outcomes because they are scared of looking bougie.
Hard problem.
This market logic doesn't apply when essentially all of EA's money comes from one source. These aren't companies with diverse revenue streams from product sales. Open Phil could make every EA org pay less tomorrow by giving each org less money, if they wanted to.
Or probably even by writing grant terms in a certain way...
That too. I find the widespread reluctance to point the finger at OpenPhil for perceived problems in EA completely bizarre. Whatever problem you think you’ve identified within or across orgs, I can almost guarantee OpenPhil can fix it because one way or another, they’re funding it. That they haven’t done so to date presumably indicates they don’t think it’s a problem or they simply don’t care.
Yeah, a significant consideration for me in deciding whether to be less professionally involved in EA is exhaustion from centralized funding and the weird power dynamics that ensue. I would rather build products that lots of people can use, and that lots of investors or donors would find attractive to fund, than be beHolden to a small coterie of grantmakers, no matter how well-intentioned.
I don’t think that’s quite fair. There are good reasons for major funders to be hesitant to use the funding hammer to micromanage grantees. One could conclude in some cases that there is a problem but that trying to fix it with the funding hammer poses too many downsides.
It is indeed a blunt instrument and might not scale well. But this is probably evidence that there are too many EA orgs, which is, once again, the fault of no one other than... OpenPhil.
I don’t think anyone is advocating for centralized planning (which would be generally illegal anyway). But I do think an org-focused compensation strategy poses serious risks of drawing talent to the best-funded org rather than the org at which their marginal impact would be highest. In the for-profit world, the scrappy upstart can offer equity to compensate for lower base pay, but that’s not a thing in the non-profit world.
Nice one Nathan
I agree that markets outcompete central planning, which is why I think the purpose of the discussion is as much to sway the market a little by shifting the views of the EA types applying for these jobs. For example, if I managed to convince a bunch of people that lower salaries were the best way forward, then the micro EA job market would shift.
I have asked for lower salaries a couple of times before, but I doubt that's a common practice.