I want to make two seemingly contradictory claims:
1. Long-term, EA needs to be less elitist, and it needs to stop doing so much of its recruiting at elite universities.
2. EA being elitist originally made sense.
One of the biggest reasons EA has gotten so powerful so quickly is that billionaires injected a lot of money, but another is its reliance on elites. Specifically, the biggest constraint in the 2010s was building a movement that could actually be effective, and since there wasn’t much money, effectiveness mattered most. And intelligence has very large effects because of heavy tails: the most intelligent people are disproportionately much more effective.
However, this constraint started to change as more people were introduced to Effective Altruism and more money entered the ecosystem.
Base rate neglect also applies here: elite universities have a higher density of talent, but the large majority of capable people are elsewhere, simply because there are so many more people elsewhere.
That said, the next decade of EA outreach needs to be less elitist than the last decade.
That short-term vs long-term distinction is really important. I agree that most major forces/movements start small (Facebook at Harvard, Apple at the Homebrew Computer Club), and that elite universities definitely would’ve been my pick of places to start.
Correct me if I’m reading too much into this, but I think you’re also implicitly suggesting that a funding-constrained org should be elitist, and a talent-constrained org shouldn’t be. I think I agree with this, and finding talent in places we wouldn’t conventionally expect it is going to become increasingly important as the old sources of talent dry up.
Not exactly what I meant. Specifically, when the sheer number of people is the bottleneck, then yes, don’t be elitist; you just hire people. It’s in higher-variance, higher-impact work that intelligence matters far more.
Basically: The higher the variance of impact, the more intelligence matters. The lower the variance, the less intelligence matters.
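To make the heavy-tails point concrete, here is a minimal Python sketch comparing a low-variance and a high-variance "impact" distribution. The distributions and parameters are pure assumptions for illustration, not estimates of anything in this discussion; the point is only that under a heavy right tail, the top fraction of people account for a disproportionate share of total impact, so selecting for the tail matters much more.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000  # hypothetical population of potential contributors

# Assumed toy distributions of per-person impact (illustrative only):
# - low variance: roughly normal around 1
# - high variance: lognormal with a heavy right tail
low_variance = rng.normal(loc=1.0, scale=0.2, size=n).clip(min=0)
high_variance = rng.lognormal(mean=0.0, sigma=1.5, size=n)

def top_share(impact, fraction=0.01):
    """Share of total impact contributed by the top `fraction` of people."""
    cutoff = np.quantile(impact, 1 - fraction)
    return impact[impact >= cutoff].sum() / impact.sum()

print(f"Low variance:  top 1% contribute {top_share(low_variance):.0%} of total impact")
print(f"High variance: top 1% contribute {top_share(high_variance):.0%} of total impact")
```

With these assumed parameters, the top 1% contribute only a percent or two of total impact in the low-variance case, but roughly a fifth of it in the heavy-tailed case, which is the sense in which "the higher the variance of impact, the more intelligence (and selection) matters."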