“5%+ of unrestricted EA talent and funding should be focused on the potential well-being of future artificial intelligence systems”.
As a rough estimate for the number of EAs, I take the number of GWWC Pledgers even if they’d consider themselves ‘EA-Adjacent’.[2] At my last check, the lifetime members page stated there were 8,983 members, so 5% of that would be ~449 EAs working specifically or primarily on the potential well-being of future artificial intelligence systems.
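This back-of-envelope figure can be reproduced directly (a minimal sketch; the member count is the one quoted above):

```python
# 5% of GWWC lifetime pledgers, using the count quoted above
gwwc_pledgers = 8983
share = 0.05
print(round(gwwc_pledgers * share))  # ~449
```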
This seems to me to be too expansive an operationalization of “EA talent”.
If we’re talking about how to allocate EA talent, it doesn’t seem that this can mean ‘all EAs’ or even all GWWC pledgers. Many of these people will be retired or earning to give, or unable to contribute to EA direct work for some other reason. Many don’t even intend to do EA direct work. And many of those who are doing EA direct work will be doing ops or other meta work, so even the EA direct work total is not the total number who could be working directly as AI welfare researchers. I think, if we use this bar, then most EA cause areas won’t reach 5% of EA talent.
In a previous survey, we found that 8.7% of respondents worked at an EA org. This is likely an overestimate, because less engaged EAs (who are less likely to take the survey) are also less likely to be EA org employees. Applying 8.7% to the total EA community (assuming growth based on the method we employed in 2019 and 2020) implies around 1,300 people in EA orgs, 5% of which would be around 67 people. We get a similar estimate from applying the method above to the total number of people who reported working in EA orgs in 2022. To be sure, counting only people at EA orgs specifically will undercount total talent, since some people do direct work outside EA orgs. But using the 2022 numbers for people reporting that they do direct work would only increase the 5% figure to around 114 (which, I argue, would still need to be discounted for people doing ops and similar work, if we want to estimate how many people should be doing AI welfare work specifically).
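The arithmetic behind these narrower estimates can be sketched as follows (assumed inputs: the 8.7% EA-org share from the survey, the ~1,300 EA-org headcount stated above, and a direct-work base inferred from the ~114 figure; exact unrounded bases may differ slightly):

```python
ea_org_share = 0.087          # survey share of respondents working at an EA org
ea_org_headcount = 1300       # approximate headcount implied by community-size estimate

# Total community size implied by the 8.7% share and ~1,300 headcount
implied_community = ea_org_headcount / ea_org_share
print(round(implied_community))        # ~14,943

# 5% of EA-org employees (close to the ~67 quoted, which presumably
# comes from an unrounded headcount)
print(round(0.05 * ea_org_headcount))  # 65

# Direct-work base implied by the ~114 figure at 5%
print(round(114 / 0.05))               # 2280
```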