Very quick comment: I think I feel this intuition, but when I step back, I’m not sure why the potential to contribute via donations should decline more slowly with ‘ability’ than the potential to contribute in other ways.
If anything, income seems to be unusually heavy-tailed compared to direct work (the top two donors in EA account for the majority of the capital, but I don’t think the top two direct workers account for the majority of the value of the labour).
I wonder whether people who can’t do the top direct work jobs might have more impact by working in government, spreading ideas / community building / generally being sensible advocates for EA ideas, taking supporting roles at non-profits, or other things.
I’d be curious to hear more thoughts on how this works.
Although I think this stylized fact remains interesting, I wonder if there’s an ex-ante/ex-post issue lurking here. You get to see the endpoint with money a lot earlier than with direct work contributions, and there are probably a lot of lottery-esque dynamics. I’d guess the following corollaries:
First, the ex ante ‘expected $ raised’ from folks aiming at earning to give (e.g. at a similar early career stage) is much more even than the ex post distribution. Algo-trader Alice and Entrepreneur Edward may have similar expected lifetime income, but Edward has much higher variance; likewise, one of entrepreneurs Edward and Edith may swamp the other if one (but not the other) hits the jackpot (a toy simulation of this is sketched after this comment).
Second, part of the reason direct work contributions look more even is that this is largely an ex ante estimate; a clairvoyant ex post assessment would likely be much more starkly skewed. For example, if work on AI paradigm X alone were sufficient to avert existential catastrophe (and that turned out to be the only such danger), the impact of the lead researcher(s) on X would be astronomically larger than everything everyone else is doing.
Third, I also wonder whether raw $ value may mislead in credit assignment for donation impact. The entrepreneur who builds a billion-dollar company hasn’t done all the work themselves, and it’s facially plausible that some Shapley-style (or similar) credit sharing between such founders and (e.g.) current junior staff would not be as disproportionate as the money which ends up in their respective bank accounts (a toy calculation of this is also sketched after this comment).
Maybe not: perhaps the reward in terms of ‘getting things off the ground’, taking lots of risk, etc. does mean the tech founder megadonor bucks should be attributed ~wholly to them. But similar reasoning could be applied to direct work as well. Perhaps the lion’s share of all contributions to global health work up to now should be accorded to (e.g.) Peter Singer, as all subsequent work is essentially ‘footnotes to Famine, Affluence, and Morality’; or AI work to those who toiled in the vineyards over a decade ago, even if their work is now a much smaller proportion of the contemporary aggregate contribution.
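A minimal simulation sketch of the first corollary, with entirely hypothetical numbers and distributions: two earning-to-give paths with the same ex ante expected donations can produce very different ex post concentration within a cohort.

```python
import random

random.seed(0)
COHORT = 100        # people per simulated cohort (hypothetical)
N_COHORTS = 1_000   # cohorts to average over

# Two hypothetical earning-to-give paths with the SAME ex ante expected
# lifetime donations (~$1M each), but very different variance.
def algo_trader():
    # Steady earner: modest spread around the mean.
    return max(random.gauss(1_000_000, 200_000), 0)

def entrepreneur():
    # Lottery-like: ~2% of founders "hit", the rest donate little.
    # 0.02 * $45M + 0.98 * $0.1M ~= $1M, so the ex ante means match.
    return 45_000_000 if random.random() < 0.02 else 100_000

def top2_share(draw):
    """Average share of a cohort's donations coming from its top two donors."""
    shares = []
    for _ in range(N_COHORTS):
        cohort = sorted((draw() for _ in range(COHORT)), reverse=True)
        shares.append(sum(cohort[:2]) / sum(cohort))
    return sum(shares) / N_COHORTS

print(f"top-2 share, traders:       {top2_share(algo_trader):.0%}")   # a few percent
print(f"top-2 share, entrepreneurs: {top2_share(entrepreneur):.0%}")  # typically well over half
```

Ex ante the two paths look alike; ex post the entrepreneur cohort is dominated by whoever happened to hit the jackpot.

And a toy Shapley calculation for the credit-sharing point in the third corollary; the coalition values are invented purely for illustration, with “early” and “late” standing in for hypothetical groups of employees.

```python
from itertools import permutations
from math import factorial

# Hypothetical coalition values, in $M: the value each subset of
# {founder, early employees, late employees} could have created on its own.
v = {
    frozenset(): 0,
    frozenset({"founder"}): 200,
    frozenset({"early"}): 50,
    frozenset({"late"}): 10,
    frozenset({"founder", "early"}): 700,
    frozenset({"founder", "late"}): 400,
    frozenset({"early", "late"}): 100,
    frozenset({"founder", "early", "late"}): 1000,
}
players = ["founder", "early", "late"]

def shapley(player):
    """Average marginal contribution of `player` over all join orders."""
    total = 0.0
    for order in permutations(players):
        before = frozenset(order[: order.index(player)])
        total += v[before | {player}] - v[before]
    return total / factorial(len(players))

for p in players:
    print(f"{p}: ${shapley(p):.0f}M of $1,000M")   # founder 540, early 315, late 145
```

Even with a heavy thumb on the scale for the founder, this split (roughly 54%, 31.5%, and 14.5%) is far less lopsided than the ownership split that typically determines where the money ends up.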
Re your third point: I find it plausible that both startup earnings and the explicit allocation of credit for research insights can, to at least some degree, be modeled as a tournament for “being first/best”, which means you get a pretty extreme distribution if you are trying to win resources (hopefully for altruism) like dollars or prestige, but a much less extreme distribution if you are trying to estimate the actual good done while spending down such resources.
Put another way, I find it farcical to think that Newton should get >20% of the credit for inventing calculus (given both the example of Leibniz and that many of the ideas were floating around at the time), probably not even >5%, yet I get the distinct impression (never checked with polling or anything) that many people would attribute the invention of calculus solely or mostly to Newton.
Similarly, there are two importantly different applied ethics questions here: whether it’s correct to give billionaires billions of dollars for their work, versus whether individuals should try to make billions of dollars to donate.
That makes sense, thanks for the comment.
I think you’re right that looking at ex post outcomes doesn’t tell us that much.
If I try to make ex ante estimates, then I’d put the value of someone pledging 10% at a couple of thousand dollars per year to the EA Funds or equivalent.
But I’d probably also put similar (or higher) figures on the value of the other ways of contributing mentioned above.
I am still confused about whether you are talking about full-time work. I’d very much hope a full-time community builder produces more value than a donation of a couple of thousand dollars to the EA Funds.
But if you are not discussing full-time work but rather part-time activities, like occasionally hosting dinners on EA-related themes, it makes sense to compare this to 10% donations (though I also don’t know why you are valuing 10% donations at ~$2,000; the median salary in most rich countries is more than ten times that).
But then it doesn’t make sense to compare the 10% donations and part-time activities to the very demanding direct work paths (e.g. AI safety research). Donating $2,000 (or generally 10%, unless the donor is on a low income) requires way less dedication than fully focussing your career on a top priority path.
Someone who would be dedicated enough to pursue a priority path but is unable to do so should in many cases be able to donate far more than $2,000. Let’s say they are “only” in the 90th percentile for ability in a rich country and will draw a 90th percentile salary, which is above £50,000 in the UK (source). If they have the same dedication level as someone in a top priority path, they should be able to donate ~£15,000 of that, which is roughly ten times as much as $2,000!
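For concreteness, the arithmetic behind that comparison can be sketched as below; the 30% donation rate and the exchange rate are illustrative assumptions, not figures from the comment.

```python
# Illustrative arithmetic only; the donation rate and the GBP->USD rate are assumptions.
salary_gbp = 50_000      # ~90th percentile UK salary, per the comment above
donation_rate = 0.30     # assumed "priority-path level" dedication to giving
gbp_to_usd = 1.30        # rough exchange rate (assumption)

donation_gbp = salary_gbp * donation_rate     # £15,000
donation_usd = donation_gbp * gbp_to_usd      # ~$19,500
baseline_usd = 2_000                          # the ~$2,000 figure discussed above

print(f"£{donation_gbp:,.0f} ≈ ${donation_usd:,.0f} "
      f"≈ {donation_usd / baseline_usd:.0f}x the ${baseline_usd:,} baseline")
```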
I was thinking of donating 10% vs. some part-time work / side projects.
I agree that someone altruistic enough to donate, say, 50% of their income but who isn’t able to get a top direct work job could donate more like $10k to $100k per year (depending on their earning potential, which might be high if they’re willing to do something like real estate, sales, or management in a non-glamorous business).
Though I still feel like there’s a good chance that someone that dedicated and able could find something that produces more impact than that, given the funding situation.
I think I might prefer to have another EA civil servant rather than $50k per year, even if they’re not in an especially influential position. Or I might prefer them to optimise for building a good network and then talking about EA ideas.
Thank you for providing more colour on your view; that’s useful!
The first thing that comes to mind here is that replaceability is a concern for direct work, but not for donations. Previously, the argument has been that replaceability does not matter as much for the very high impact roles, as their impact is likely heavy-tailed and the gap between the first and second applicant therefore large.
But that is no longer true once you leave the tails: you get the full impact from donations but less impact from direct work due to replaceability concerns. This also makes me a bit confused about your statement that income is unusually heavy-tailed compared to direct work. That may be so, but I am specifically not talking about the tails; I am talking about everyone who isn’t in the top ~3% for “ability”.
Or looking at this differently: for the top few percent, we think they should try to have their impact via direct work first. But it seems pretty clear (at least I think so?) that a person in the bottom 20th percentile in a rich country should try to maximise income to donate instead of doing direct work. The crossover point, where the recommendation switches from focusing on donations to focusing on direct work, therefore needs to be somewhere between the 20th and 97th percentile. It is entirely possible that it is pretty low on that curve, and admittedly most people interested in EA are above average in ability, but the crossover point has to be somewhere, and then we need to figure out where.
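Purely to illustrate what ‘figuring out where’ could look like, here is a toy model in which both impact curves and every constant are invented assumptions rather than estimates.

```python
# Toy model only: both curves below are made up to illustrate the idea of a
# crossover percentile, not to estimate where it actually lies.

def donation_impact(p):
    """Hypothetical impact from donations at ability percentile p (0 to 1):
    roughly tracks salary, so it is fairly flat across percentiles."""
    return 1.0 + 4.0 * p

def direct_work_impact(p):
    """Hypothetical impact from direct work: negligible for most percentiles,
    rising steeply near the top (a crude stand-in for heavy-tailedness)."""
    return 8.0 * p ** 8

# Scan the percentiles discussed above and report where direct work overtakes donations.
for pct in range(20, 98):
    if direct_work_impact(pct / 100) > donation_impact(pct / 100):
        print(f"Crossover around percentile {pct} in this toy model.")
        break
else:
    print("Direct work never overtakes donations up to the 97th percentile in this toy model.")
```

Different assumed curves move the crossover a lot, which is exactly the empirical question raised above.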
For working in government policy, I also expect that only the top ~3% in ability have a shot at highly impactful roles or are able to shape their role in an impactful way beyond their job description. When you talk about advocacy, I am not sure whether you still mean full-time roles. If so, I find it plausible that you do not need to be in the top ~3% for community-building roles, but that is mostly because there are plenty of geographical areas where no one is working on EA community building full-time, which lowers the bar for having an impact.