EA jobs provide scarce non-monetary goods
Epistemic status: hypothesizing
Related: It is really, really hard to get hired by an EA organisation
Also related: The career coordination problem, A guide to improving your odds at getting a job in EA, EA and meaning in life, EA is vetting-constrained, What to do with people?, Identifying talent without credentialing in EA, SHOW: A framework for shaping your talent for direct work, The career and the community
As the recent catalyzing EA jobs post was blowing up, a friend of mine observed that from a simple economics perspective, the obvious response to this state of affairs would be to pay labor less (i.e. lower compensation for the professional EA roles being hired for).
Following a simple economics framework, lowering salaries would in turn lower demand for the jobs, resulting in fewer applications and less competition. Fewer people would want professional EA jobs, but those who did would find them easier to get.
When my friend said this, it seemed clear to me that lowering salaries wouldn’t have the proposed effect.
After some discussion, I arrived at a hypothesis for why not: professional EA organizations provision scarce, non-monetary goods to their employees.
Specifically, working at a professional EA organization can provide the following non-monetary benefits (in no particular order):
Social status. In the EA & rationality subcultures, working at professional EA organizations is high status (e.g. when I started at GiveWell, I was surprised at how people in these circles treated me when they found out I was working there, even though I was an entry-level employee).
Meaning-making / life orientation. At least for me, working at EA organizations can provide a sense of resolution to existentialist questions of purpose & meaning, at least at first. (e.g. “What’s the point? The point is to help as many people as possible with the limited resources at hand.”)
A sense of having a near-maximal altruistic impact. Working at EA organizations can provide reassurance that you’re doing the best you can / working on an optimal project (or at least a near-optimal one, modulo the current set of projects available to work on). I think this sense is stronger for more meta-level work, like career advising or grant-making. (Also note that social status seems to correlate with how meta the project is; e.g. compare the demand to work at Open Phil or 80,000 Hours with the demand to work at New Incentives.)
Being part of a value-aligned, elite tribe. Somewhat mixed in with the above points, I think there’s a lot of value to be had from feeling like a member of a tribe, especially a tribe that you think is awesome. I think working at a professional EA organization is the closest thing there is to a royal road to tribal membership in the EA community (as well as the rationality community, to a lesser extent).
Because these goods are non-monetary, it’d be difficult for EA organizations to reduce their quantity even if they wanted to (and for the most part, they probably don’t want to, since degrading these goods would also degrade much of what makes EA worthwhile).
This leads me to think that demand for jobs at professional EA organizations will continue to be very high for the foreseeable future, and especially so for meta-level EA organizations.
For reference, I spent two years as a research analyst at GiveWell, then two years as head of risk at Wave. These days I’m doing independent research, occasional contract work for Ought & toying with a startup idea.
As of 2022-06-01, the certificate of this article is owned by Milan Griffes (100%).
(Obvious CoI/own views, but in my defence I’ve been arguing along these lines long before I had—or expected to have—an EA job.)
I agree ‘EA jobs’ provide substantial non-monetary goods, and that the ‘supply’ of willing applicants will likely outstrip the available positions. Yet that doesn’t mean the ‘supply’ of potential EA employees is (mostly) inelastic to compensation.
In principle, money is handy for all manner of interests one may have, including altruistic ones. Unless folks are purely motivated by altruistic ends (and motivated in such a way that they’re indifferent to having more money to give away themselves), you’d expect them to be salary-sensitive. I aver basically everyone in EA is therefore (substantially) salary-sensitive.
In practice, I know of cases (including my own) where compensation played a role in a decision to change jobs, quit, or not apply. I also recall remarks on the forum from people running orgs that can’t compensate as generously as others, noting that this hurts their recruitment.
So I’m pretty sure that if you dropped salaries you would reduce the number of eager applicants (albeit perhaps with greater inelasticity than in many other industries). As (I think) you imply, this would be a bad idea: from the point of view of an org, controlling the overall ‘supply’ of applicants shouldn’t be the priority (rather, they should set salaries as needed to attract the most cost-effective employees). From the wider community’s point of view, you’d want to address ‘EA underemployment’ in ways other than pushing to distort the labour market.
Totally agree. I think we’re aligned on all your points.
I’d expect a reduction but not a drastic one. Like I’d predict Open Phil’s applicant pool to drop to 500-600 from 800 if they cut starting salary by $10k-$15k.
Right, I (mis?)took the OP to be arguing “reducing salaries wouldn’t have an effect on labour supply, because it is price inelastic”, instead of “reducing salaries wouldn’t have enough of an effect to qualitatively change oversupply”.
Aside:
This roughly cashes out to a wage elasticity of labour (/applicant) supply of 1-2 (i.e. you reduce applicant supply by ~20% by reducing income ~10%). Although a crisp comparison is hard to find, estimates in the wider labour market are generally <1, so this expectation slightly goes against the OP, as it suggests EA applicants are more compensation-sensitive than typical.
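For concreteness, here’s the arithmetic as a small Python sketch. The baseline salary is my own assumption (Open Phil’s actual figure isn’t given in this thread); the other numbers are the ones quoted above.

```python
# Back-of-the-envelope elasticity implied by the numbers quoted above.
# NOTE: baseline_salary is a hypothetical assumption for illustration;
# it is not stated anywhere in the thread.

baseline_salary = 75_000   # assumed baseline salary (hypothetical)
salary_cut = 12_500        # midpoint of the $10k-$15k cut mentioned above
applicants_before = 800
applicants_after = 550     # midpoint of the predicted 500-600 range

pct_wage_change = salary_cut / baseline_salary          # ~0.17
pct_applicant_change = (
    applicants_before - applicants_after
) / applicants_before                                   # ~0.31

elasticity = pct_applicant_change / pct_wage_change
print(f"Implied applicant-supply elasticity: {elasticity:.1f}")  # ~1.9
```

Under that assumed baseline the implied elasticity lands near the top of the 1-2 range; a higher baseline salary would push it higher still.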
Right, this^ is what I mean.
Oh, interesting.
It was a rough, off-the-top-of-my-head prediction, so I wouldn’t give the specific numbers too much weight.
That said, there’s probably a gradient of applicant income elasticity here (and in most places? I don’t know very much about labor economics).
I’d expect dropping salaries by $10k to reduce the applicant pool substantially, and by $20k to reduce it somewhat more. But there’s probably a hard core of applicants whose demand is quite inelastic (i.e. who would be excited to apply to EA jobs regardless of whether they paid $35,000 or $70,000).
And there’s probably also some lower bound below which the salary is too little to live on, such that setting salaries under that threshold would cause a sharp drop-off in applications.
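To make that shape concrete, here’s a toy sketch. Every constant in it (the living-wage floor, the size of the ‘hard core’, the size of the pay-sensitive pool) is my own illustrative assumption, not a number from this thread.

```python
# Toy model of the applicant-pool shape described above: a sharp drop-off
# below a living-wage floor, a pay-sensitive middle, and a "hard core" of
# applicants who apply regardless of salary. All constants are illustrative
# assumptions, not data from the thread.

LIVING_WAGE = 30_000    # assumed floor: below this, applications collapse
HARD_CORE = 150         # assumed number of pay-insensitive applicants
SENSITIVE_AT_70K = 650  # assumed pay-sensitive applicants at a $70k salary

def expected_applicants(salary: float) -> int:
    """Rough piecewise guess at applicant count as a function of offered salary."""
    if salary < LIVING_WAGE:
        return 0  # below subsistence, almost nobody can afford to apply
    # Pay-sensitive applicants scale linearly with salary above the floor.
    sensitive = SENSITIVE_AT_70K * (salary - LIVING_WAGE) / (70_000 - LIVING_WAGE)
    return HARD_CORE + round(sensitive)

for s in (25_000, 35_000, 50_000, 60_000, 70_000):
    print(f"${s:,}: ~{expected_applicants(s)} applicants")
```

With these made-up constants the pool at $70k comes out at 800 (matching the figure used earlier), a $10k cut trims it to roughly 640, and anything below the floor collapses toward zero.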
Strong upvote. I think this is exactly it.
Totally agree that EA jobs provide scarce non-monetary goods, and hat tip for looking at this issue through a standard economics lens.
I’ve argued that compensation norms based on offsetting low salaries with high non-monetary pay are problematic, in part because they bias which sorts of candidates are attracted. If you pay people with money, they can use it however they want. If you pay them in e.g. flexible hours or social status, there’ll be variability in how much people value that, and you’ll disproportionately attract candidates who value it a lot. For example, I argue that experienced candidates will likely prefer monetary compensation relative to inexperienced candidates, for several reasons.
Good post! I share Greg’s doubts about the particular question of salaries (and think that lowering them would have several bad consequences), but I think you’ve summed up most of the major things that people get, or hope to get, from jobs at EA organizations.
Other than your reasons and “money”, I’d include “training”; if you want to learn to do Open Phil-style research, working at Open Phil is the most reliable way to do this.
Are there any examples of this that stand out to you? I can certainly believe that it happened, but I’m having trouble picturing what it might look like.
(Since I began working at CEA five months ago, I haven’t noticed any difference in the way my interactions with people in EA have gone, save for cases where the interaction was directly related to my job. But perhaps there are effects for me, too, and I just haven’t spotted them yet.)
I think you’re right that EA work is a quick way to feel like part of the tribe, and that’s something I’d like to change.
So I’ll repeat what I’ve said in the comments of other posts: If you believe in the principles of EA, and are taking action on them in some way (work, research, donations, advocacy, or taking steps to do any of those things in the future), I consider you a member of the EA “tribe”.
I can’t speak for any other person in EA, but from what I’ve heard in conversations with people at many different organizations, I think that something like my view is fairly common.
+1, though I don’t think this is common knowledge. (And even if a lot of people say this, many people may still not believe it.)
Eh, just feeling like I was given a lot of conversational space at parties, and people occasionally asking me a bunch of questions about GiveWell research and/or what working there was like.
Data points:
We offered a job with no monetary reward at all (except for a place in the EA Hotel) and we still got 10 applications.
When we offered a job with a negative salary, we didn’t get any applicants (yet).
Obviously, these numbers might be influenced by many factors besides pay.
Wow, I’m surprised that a job which only paid out room & board received 10 applications!
Is the job structured such that people would have enough time to work on other projects, or to hold another job alongside?
Nope, it’s full-time. Right now two of us are doing a side project, but that’s not usual.
[I work at 80,000 Hours]
It seems like you’re assuming that it would be better if EA organisations could make their jobs less desirable, in order to put off applicants so that the jobs would be less competitive. That doesn’t seem right to me.
Making the jobs less desirable is likely to either put off applicants at random, or even disproportionately put off the most experienced applicants who are most picky about jobs. That would seem reasonable to do if EA orgs were getting plenty of applicants above the bar to hire, and didn’t think there would be much difference in job performance amongst them. But that doesn’t seem to be the situation these organisations are reporting. Given that, we’d expect that reducing applicants by making the jobs less desirable would harm the beneficiaries of the organisations.
I don’t think I’m assuming that. See this part of the original post:
I don’t think it would necessarily be good for EA organizations to make their jobs less desirable. I feel agnostic about that question.
My main conclusion here is that EA jobs are going to continue to be in high demand so long as they continue to provision scarce, non-monetary goods. See this part of the post:
And I’m agnostic about whether that’s a good thing or a bad thing.
“I don’t think I’m assuming that.”
That’s fair—my bad.
I thought it was worth making this point because an obvious response to your conclusion that “demand for jobs at professional EA organizations will continue to be very high” is to not worry if demand for these jobs drops. Or one could go further, and think that it would be good if demand dropped, given that there are costs to being an unsuccessful applicant. I appreciate that you’re agnostic on whether people should have that response, but I personally think it would be bad, in part due to the reasoning in my previous comment.