The biggest danger to EA is being cool. I mean this completely seriously. When EA becomes cool, people who don't care about EA show up to secure money and status. When EA becomes cool, that reputation needs defending, which is often corrosive to truth. Being unpopular enough to deter status-seekers, while not being so universally loathed that even mission-aligned nerds hesitate to associate with EA for fear of social repercussions, is exactly where we ought to be.
I think the argument that "EA could achieve more if it had a better reputation" is compelling, intuitive, and wrong. It seems like you're imagining a cool version of EA that tons of smart people want to join, yet which maintains the same level of mission alignment and commitment to truth. I think this is actually impossible.
EA's impact is a product of magnitude × direction. A better reputation increases our magnitude, but it's very easy for that direction to get much closer to zero. (And since we take the money of committed altruists, a direction that is insufficiently positive is actually net-negative, thanks to opportunity cost.)
I don't think "having a better reputation" means "being cool".
I think "having a better reputation" means people primarily associating EA with core ideas such as evidence, cost-effectiveness, impact, impartiality, and counterfactual thinking (rather than, e.g., FTX).
I suspect there are plenty of "mission-aligned nerds" out there who have been put off EA because they first heard about the bad stuff rather than the good stuff (though I also expect the overwhelming majority of "mission-aligned nerds" simply haven't heard of EA at all).
I'd go further and say that the FTX-ish reputation ("EA is where extremely wealthy Silicon Valley nerds brag about their generosity whilst mostly funnelling money to people like them and using it as an avenue for self-promotion") also attracts the wrong sort of people. Before there were people complaining that FTX was a scam, there were people complaining that the perceived ease of getting FTX funding attracted the insincere.
(Other negative EA stereotypes contribute to putting well-intentioned people off, but I'm not sure they actually attract the wrong people.)
I think the magnitude × direction framing is really useful, and I agree the risk is real.
At EA Netherlands, I've been thinking about this through the lens of Ben Todd's notion of "community capital": roughly, the stock of shared values, trust, human capital, coordination capacity, norm-following, and reputation that a community accumulates over time. The worry you're describing is essentially that outreach erodes community capital. And it can, if you do it carelessly.
But my ambition is to try to monitor this over time. If we surveyed relevant aspects of the community periodically (tracking its values, the degree to which people are following community norms, the actions people are taking, the degree of interconnectedness between members, and the quality of human capital coming in), we might be able to detect whether outreach efforts are degrading the direction term, and course-correct if they are.[1]
If that's feasible, it turns a binary question ("should we grow or not?") into an empirical one ("is this particular form of growth maintaining alignment?"). Some forms of outreach might pass the test, and others might not. But you'd want to check rather than assume.
I think the implicit model in your comment is one where we have to choose: stay small and aligned, or grow and dilute. But perhaps there's a third option: invest seriously in both outreach and monitoring community capital, and course-correct over time.
Good point. However, my claim is not that "EA should be cool" or that we should work to make it mainstream; I pretty much agree with you on that. My point is that EA should put more effort into building its own narrative for the general public (which doesn't mean trying to make it look cool); otherwise that narrative will be built by someone else, and the outcome will very likely not be beneficial for EA itself.
EA diluting its message to expand would result in more unqualified people applying for jobs on the EA job boards, which would make them worse job boards.
Although I don't think EA should go mainstream and have its message diluted, I think this statement is wrong. Unqualified and qualified applicants would have the same probability of stumbling across EA if it ever went mainstream. The idea that "smart people don't consume mainstream stuff" is very wrong.
"When EA becomes cool, people who don't care about EA show up to secure money and status. When EA becomes cool, that reputation needs defending, which is often corrosive to truth."
I disagree pretty strongly with this.
(You reminded me of this essay.)
Out of interest, has this been considered, @David_Moss?
It has been considered for the EA Survey! I'm not sure why this was never prioritised for inclusion after being raised. But if the meta orgs we work with say they want us to include these questions in the survey going forward, we will.
Good to know! If it doesn't get included in the EA Survey, we might consider doing it ourselves at the national level (M&E budget allowing...).
But obviously, international data that would allow us to compare across regions would be more useful.
"When EA becomes cool, people who don't care about EA show up to secure money and status. When EA becomes cool, that reputation needs defending, which is often corrosive to truth."
I think that's very well said.