Unfortunately, there is an incentive for EA organizations to compete for candidates based on their impact value rather than market value. Each organization can increase its impact by offering better incentives or salaries to attract the most talented EAs.
Paying in proportion to impact is not unfortunate at all. More impactful jobs should be paid higher, so that workers who follow that incentive gradient end up at the most impactful roles.
+1; higher salary is a hard-to-fake market signal of funders thinking that the job is valuable.
True, but not useful. Salary is a very weak signal of impact, and a high salary can mean many other things.
For example, here are some other things a high salary could signal, many of which suggest that chasing a high salary is NOT a good idea. A high salary might be a sign that:
An org is managing its resources poorly (I know some folk who think their orgs overspend needlessly).
An org is failing to hire, which could be a sign of internal problems or bad management.
An org is more established. Newer orgs may struggle to demonstrate impact and so receive less funding, but if you buy the value of more entrepreneurship in EA, it might be better to join a newer org at a lower salary.
An org is a funding org. Funders tend to pay more than doers. If you think doers are more needed in EA right now, then you shouldn’t follow the salaries to funders.
An org is a longtermist org. These tend to pay higher as the existing orgs in the space are not very scalable or good at absorbing funds yet. I think folks should judge cause areas on factors other than salaries.
The counterfactual impact of taking the job is lower. If you are not salary-motivated but think other folks are, a higher salary might be a sign that your counterfactual impact will be lower compared to other applicants.
Salary is an extremely messy signal, with many factors affecting it. I would recommend that folks in EA not update too strongly for or against a job based on its salary.
At least from a personal perspective, if I see an EA job with a super high salary, I sometimes catch myself thinking: the salary is so high that they will definitely find someone for that job, so I would rather work at a less established org that can only offer a lower salary. So my brain sees lower salaries as a signal of higher counterfactual impact. I’m not sure this is a great way to think (I don’t think folks should judge impact based on salary, and I might just be confabulating to justify my past decision-making). But if a fair number of other folks do think like that, then orgs that compete for altruistic talent with high salaries might be shooting themselves in the foot.
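The counterfactual reasoning above can be made concrete with a toy sketch (all impact numbers are hypothetical, chosen only to illustrate the shape of the argument): if a high-salary role would likely attract a near-equally-strong replacement, your counterfactual impact there can be much smaller than at a low-salary role that might otherwise go unfilled.

```python
# Toy model of counterfactual impact (all numbers are hypothetical).
# Counterfactual impact = your impact in the role minus the impact of
# whoever would have filled it had you declined.

def counterfactual_impact(your_impact: float, replacement_impact: float) -> float:
    return your_impact - replacement_impact

# High-salary role: a strong replacement applicant is likely.
high_salary_role = counterfactual_impact(your_impact=100, replacement_impact=90)

# Low-salary role at a newer org: the role might otherwise go unfilled.
low_salary_role = counterfactual_impact(your_impact=70, replacement_impact=0)

print(high_salary_role)  # 10
print(low_salary_role)   # 70
```

On these made-up numbers, the nominally less impactful role wins on counterfactual impact, which is the intuition the comment describes.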
Alternatively, the salary is so high because it is difficult to find someone for that job.
This is not at all clear to me. Like, I can process the considerations that you’ve written, but it’s not clear to me that they outweigh the market signal, and I would still lean towards no.
There are other considerations that complicate the analysis. For example, it creates an incentive for grant seekers to request much more money than they need. Someone who is motivated by the expectation of impact and only needs money to cover basic expenses should, other things equal, not be put in a situation where they feel they need to ask for way more money or else create the perception that their work is low value.
That’s true; it’s more that I think it can lead to poor incentives in competition between organizations, and that we could achieve the same gradient at a much lower level, closer to candidates’ ‘normal’ job counterfactuals.
I’m also probably more skeptical than the average EA about how good a signal funding is of org impact. Even in EA, I would bet a charismatic, well-networked fundraiser does more for an organization’s funding than improving the organization does (with this being more true in the short term and for an equivalent cost).
How? If organisations try to convince funders of their impact so that they can pay bigger salaries, this is good prima facie.
But most EAs could earn more outside of EA than they do at EA organisations. Note that earn-to-give EAs tend to have more funds available for personal consumption than those doing direct work, even after their donations. I think it is not unreasonable for there to be some difference, because direct work is often fulfilling and has other perks, but we shouldn’t delude ourselves that direct work pays more.
Good point, and I agree for some roles, such as technical AI safety researchers. Being a recent graduate, I can see what my EA and non-EA friends can get in the job market, and some of my EA friends are better compensated. It’s possible my EA friends are more competent and could command a higher wage, but I don’t get that impression. For a more solid case study of what I’m talking about: Office Manager, New York EA Hub: $85,000 - $100,000
Office Manager Salaries in New York from Glassdoor: ~$55,000
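Taking the midpoint of the quoted range, the gap works out to a sizeable premium (a rough back-of-envelope on the figures above, nothing more):

```python
# Back-of-envelope on the quoted salary figures.
ea_low, ea_high = 85_000, 100_000   # EA Hub office manager range
market = 55_000                     # approximate Glassdoor figure for NYC

ea_mid = (ea_low + ea_high) / 2     # midpoint of the EA range
premium = ea_mid / market - 1       # fractional premium over market

print(f"EA midpoint: ${ea_mid:,.0f}, premium over market: {premium:.0%}")
```

So on these numbers the EA role pays roughly a two-thirds premium over the Glassdoor median.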
I agree that some EA jobs in ops pay above market. But there has been a community-wide shortage of ops staff for over five years, so the salaries help retain people who would otherwise go for jobs outside EA. Often people’s alternative jobs are pretty good too: note that an office manager at Google makes $85k, and some ops staff in EA have even stronger alternatives, such as working as a consultant, a programmer, or a product manager. Plus this particular job is probably a pretty difficult office manager job (working without a big local team, setting up a new office). And there has been recent inflation.
Also, the quoted passage seems to assume that EA orgs optimize for their org’s impact rather than for the impact of the movement/good of the world. I’m not convinced that’s true. I would be surprised if EA orgs were attempting to poach workers they explicitly believed were having more impact at other organizations.
It does seem possible that orgs overestimate their own impact/the impact of roles they hire for. However, this would still lead to a much smaller effect than if they completely ignore the impact of candidates at their current roles, as the post seems to assume.
I think EA orgs are probably more sensitive to this issue than any other orgs in the world, but there is still probably some amount of ‘wanting your own org to be the one having the impact and the power’; it is hard to completely eliminate this part of human nature.