I agree that this is a problem, but I don’t agree with the causal model and so I don’t agree with the solution.
Looking at the EA Survey, the best predictor of which cause a person believes is most important is which cause they thought was important before they found EA and considered cause prioritization.
I’d guess that the majority of the people who take the EA Survey are fairly new to EA and haven’t encountered all of the arguments etc. that it would take to change their minds, not to mention all of the rationality “tips and tricks” to become better at changing your mind in the first place. It took me a year or so to get familiar with all of the main EA arguments, and I think that’s pretty typical.
TL;DR I don’t think there’s good signal in this piece of evidence. It would be much more compelling if it were restricted to people who were very involved in EA.
Moreover, regional EA groups often converge on a cause not because people moved there to be with like-minded EAs, but because they “became” EAs there.
I’d propose a different model for the regional EA groups. I think that the founders are often quite knowledgeable about EA, and then new EAs hear strong arguments for whichever causes the founders like and so tend to accept that. (This would happen even if the founders try to expose new EAs to all of the arguments—we would expect the founders to be able to best explain the arguments for their own cause area, leading to a bias.)
In addition, it seems like regional groups often prioritize outreach over gaining knowledge, so you’ll have students who have heard a lot about global poverty and perhaps meta-charity who then help organize speaker events and discussion groups, even though they’ve barely heard of other areas.
Based on this model, the fix could be making sure that new EAs are exposed to a broader range of EA thought fairly quickly.