I think this is an excellent point. Something I'd like to write into a forum post someday, if I reach any actual conclusions, is that EA seems to have some difficulties that are inherent in the mathematical realities of the movement.
On the one hand, EA wants to grow and advocate more publicly. This makes sense and is a good thing for a movement. While EA definitely favours slower, more sustainable growth, my understanding is that when EA says it wants quality, it means quality of fit more so than explicitly targeting quality of talent/money/resources. We want people aligned with the movement; it's okay if they aren't hugely influential in their fields.
On the other hand... EA is essentially a love letter to the Pareto principle. The guiding principle of EA is that some interventions are much, MUCH more effective than others. The unfortunate truth is that in many fields, the same applies not just to organisations, but to people. One Sam Bankman-Fried has the impact of thousands or tens of thousands of "ordinary" people. And even then, an "ordinary" person here is someone who makes a median or above-median income in a wealthy country and donates 10% of it per year; even THIS is not a low bar to clear!
Research productivity isn't quite as lopsided as this, but some people absolutely have far more impact than others. EA also badly needs people, but it badly needs people who meet a certain talent bar. Once again, the Pareto principle comes into play. If an EA organisation wants to double its headcount, one might think "Oh, it'll be easy to get a job there!" And yet, the objective standard is often very high. It might be very easy to find a job if you meet that standard (see https://www.lesswrong.com/posts/YDF7XhMThhNfHfim9/ai-safety-needs-great-engineers for an example), but meeting that standard is HARD.
The final issue is this. Earning to give and donating 10% to charity is a totally reasonable path that can save dozens or hundreds of lives. Basically any effective altruist would say this is a good thing, that you should be proud of it, and that it is a worthy contribution to the cause. But if you engage with EA materials, you will hear about this a few times, and you will hear far more about other causes. Why is that? Because... there's just not that much to say about it, really. Once the research infrastructure is in place (which charities are effective, where to donate), the only updates in this area tend to be "Here's a way to get other people to give more", "Let's celebrate X Day", and "Hey, GiveWell found a new effective charity!". If we wanted to make a "10%-er post" every week, I feel like we'd run out of content pretty quickly.
Therefore, when most people engage with places like the EA Forum, they find people talking about things that probably aren't relevant to them. Most posters are discussing things that are less obvious, still being fleshed out, or that require specific, often highly niche and difficult-to-obtain talents. This isn't because EA is elitist and deliberately shutting out the plebeians who don't want to devote their whole career to EA; it's because these are the areas where new content is needed, and where new content won't repeat itself.
I don't yet have any suggestions about this.