Another way to do this is the rationality community's way: its highest-status members are often pseudonymous internet writers, sometimes with no visible credentials and sometimes with active disdain for credentials (citing the observation that argument screens off authority).
Gwern has no (visible) credentials (unless you count the huge & excellent website as one), Yudkowsky disdains them, Scott Alexander sometimes brings them up, Applied Divinity Studies and dynomight and Fantastic Anachronism are all pseudonymous and probably prefer to keep it that way…
I think it’s much easier to be heard & respected in the EA community purely through online writing & content production (for which you “only” need intelligence, conscientiousness & time, but rarely connections) than in most other communities (and especially academia).
Not wrong but not helpful imo. (Past treatments of the theme: here, here, here.)
The main problem is that you're not considering the base rate for elitism / credentialism / privilege in the reference class "philanthropy / intellectual movements / technical fields / levers of power". I'm a first-generation college graduate (and not from an elite college either), and I can tell you that my EA clients care the least about this of any class of clients (corporate, academic, government, non-EA philanthropy) by far.
Similarly: c'mon, EA is 20% non-straight, as opposed to something like 5% in the US.
It’s also just temporary founder effects plus previous lack of resources. One of the many boons of the funding influx is that we can start lifting unprivileged students. I’ve seen this happen ten times this year. There are lots of people trying to expand into Latin America and India. It’s hard!
To be honest “straight white male” feels a bit like a cached thought to me. Are you sure that the number of EAs who are gay or bi would be less than the population distribution or less than the population distribution among the educated classes? It wouldn’t surprise me if it were more.
Anecdotally, the EAs I know are more queer, MUCH more white, more male, and richer even compared to other people I know from the educated classes.
Am I right in interpreting “cached thought” as, “something I think about a lot, and need to get out”? I seriously apologise if that offends you (and I mean that seriously, I’m not being sarcastic) but I find it hard to say that the faces of EA aren’t overwhelmingly white, wealthy males. You might be right on the LGBT point—I must confess that I’m new to the community and haven’t met anyone who self-identifies as an “effective altruist” yet. I’d still say that I think race and class background differences will alienate people if we keep going this way.
Sorry, that's not what I meant. See this article for a definition of cached thoughts: https://www.lesswrong.com/tag/cached-thoughts
My point is that I’ve found people who come from leftist spaces sometimes have a tendency of saying a space is full of “straight white males” because that’s a catchy phrase without checking all three of the predicates to see if they match the particular context.
I want to make two seemingly contradictory claims:
EA long-term needs to be less elitist, and it needs to stop doing so much of its recruiting at elite universities.
EA originally being elitist made sense.
One of the biggest reasons EA has gotten so powerful so quickly is billionaires injecting a lot of money, but elites mattered too. Specifically, the biggest constraint in the early 2010s was building a movement that could actually be effective, and since it didn't have much money, effectiveness mattered most. And intelligence has very large effects because of heavy tails: the most intelligent people are disproportionately more effective.
However, this constraint started to change once more people got introduced to Effective Altruism, and more money got into the ecosystem.
Base rate neglect also applies here.
That stated, the next decade of EA outreach needs to be less elitist than the last decade.
That short-term vs long-term distinction is really important. I agree that most major forces/movements start small (Facebook at Harvard, Apple at the Homebrew Computer Club), and that elite universities definitely would’ve been my pick of places to start.
Correct me if I'm reading this wrong, but I think you're also implicitly suggesting that a funding-constrained org should be elitist, and a talent-constrained org shouldn't be. I think I agree with this, and finding talent in places we wouldn't conventionally expect it is going to become increasingly important as the old sources of talent dry up.
Not exactly what I meant. Specifically, when the number of people is the bottleneck, then yeah, don't be elitist: you just hire people. It's in higher-variance impact work that intelligence matters far more.
Basically: The higher the variance of impact, the more intelligence matters. The lower the variance, the less intelligence matters.