I think this is an excellent point. Something I’d like to write into a forum post someday if I get any actual conclusions is that EA seems to have some difficulties that are inherent in the mathematical realities of the movement.
On the one hand, EA wants to grow and advocate more publicly. This makes sense and is a good thing for a movement. While EA definitely favours slower, more sustainable growth, my understanding is that when EA says it wants quality, it means quality of fit more so than explicitly targeting quality of talent/money/resources. We want people aligned with the movement—it’s okay if they aren’t hugely influential in their fields.
On the other hand...EA is essentially a love letter to the Pareto principle. The guiding insight of EA is that some interventions are much, MUCH more effective than others. The unfortunate truth is that in many fields, the same applies not just to organisations, but to people. One Sam Bankman-Fried has the impact of thousands or tens of thousands of “ordinary” people. And even then, an “ordinary” person here is someone who makes a median or above-median income in a wealthy country and donates 10% of it per year—even THIS is not a low bar to clear!
Research productivity isn’t quite as lopsided as this, but there are absolutely some people who have far more impact than others. EA badly needs people, but it badly needs people who meet a certain talent bar. Once again, the Pareto principle comes into play. If an EA organisation wants to double its headcount, one might think, “Oh, it’ll be easy to get a job there!” And yet the objective standard is often very high. It might be very easy to find a job if you meet that standard (https://www.lesswrong.com/posts/YDF7XhMThhNfHfim9/ai-safety-needs-great-engineers for an example), but meeting that standard is HARD.
The final issue is this. Earning-to-give and donating 10% to charity is a totally reasonable path that can save dozens or hundreds of lives. Basically any effective altruist would say this is a good thing, that you should be proud of it, and that it is a worthy contribution to the cause. But if you engage with EA materials, you will hear about this a few times, and you will hear far more about other causes. Why is that? Because...there’s just not that much to say about it, really. Once the research infrastructure is in place (which charities are effective, where to donate), the only updates in this area tend to be “Here’s a way to advocate to get other people to give more”, “Let’s celebrate X Day”, and “Hey, GiveWell found a new effective charity!” If we wanted to publish a “10%-er post” every week, I feel like we’d run out of content pretty quickly.
Therefore, when most people engage with places like the EA Forum, they find people talking about things that probably aren’t relevant to them. Most discussion concerns things that are less obvious, still being fleshed out, or requiring specific, often highly niche and difficult-to-obtain talents. This isn’t because EA is elitist and deliberately shutting out the plebeians who don’t want to devote their whole career to EA; it’s because these are the areas where new content is needed, and where new content won’t repeat itself.
I don’t yet have any suggestions about this.