I agree that having these roles filled is still very valuable for the world and would continue to be so at higher wages. My worry comes from seeing what candidates' next-best alternative jobs look like. I worry that EA jobs are too good a deal, e.g. better benefits, better salary, and more impact, when one or two of those would be enough to motivate someone into the job. As you mention, this won't be as true for some types of roles, such as operations or computer science roles, where transferring to higher-paid 'normal' jobs is easier.
I don't know whether "nonprofit employees deserve more" is a relevant question, as that's more subjective and comes at the cost of the organisation's beneficiaries (if deservingness is the goal).
I worry that EA jobs are too good a deal, e.g. better benefits, better salary, and more impact, when one or two of those would be enough to motivate someone into the job.
Imagine if Google [edit: in its early high-growth phase] said something like this: our company is so impactful that we ought to pay a salary far below the industry standard, to avoid making our job offers "too good". Clearly this is wrong. Yes, there is an effect in this direction. But if you stoop down to nonprofit salaries, you will lose more from being unable to recruit selfish talent than you would from overpaying the altruistic talent.
Note also that if the talent is truly as fully altruistic as they would have to be for your logic to work out, they could negotiate their salary down, or donate it onwards, so the cost of overpaying them should be quite small indeed.
Agree with your conclusion, but I don't see the Google analogy. Google doesn't expect its employees to be prosocially or impact motivated. And what is good decision logic for maximising Google's profits might correspond to terrible logic for an EA org to follow, e.g. unpredictable product rollouts to confuse the competition, or trying to lock in markets and systems.
Sorry, I was picturing an early-stage Google that could expect their staff to be at least a bit altruistic. They had a giant ratio of users to staff, such that each staff member genuinely would have an enormous positive impact, and growth and impact were aligned at least somewhat.