Compare Doing Good Effectively is Unusual, for a more positive take on this phenomenon. (E.g. the abstract EA mission is actually pretty important for some to pursue, because otherwise humanity will systematically neglect causes like Shrimp Welfare that don't have immediate emotional appeal.)
It's sad that not many people care about doing good as such, but I still think it's worth: (i) trying to co-ordinate those who do, (ii) trying to encourage more people to join them, and (iii) co-operating with others who have more cause-specific motivations that happen to be good ones (whether that's in global health, animal welfare, AI safety, or whatever).
Overall, I'm not sure why you would think "EA needs a cultural shift" rather than "we need more EA-adjacent movements/subcultures for people who don't feel moved by the core EA mission but do or could care more about specific causes." Isn't it better to add than to replace?
On a practical level, I don't necessarily disagree with anything you're saying in the first two paragraphs. I tried to address some of what you're saying in my conclusion, and I don't think anything in the "main" argument (the benefits missions provide but EA is currently missing) is incompatible with having the abstract "doing good" as the core EA thing (so then it just becomes a semantic question of how we define missions).
As for your last paragraph, I argue for a cultural shift because I've personally seen a lot of people who resonate very much with EA intellectually but not emotionally (like here). This is fine when they have an easy transition into a high-impact role and there is less abstract stuff they can feel emotional about, but a lot of people don't "survive" that transition. They are aligned on principles, but EA is a really different community and movement that takes time to get used to. The current EA community seems to select not only for people who share values, but also for people who share personality traits. I think that's bad.
(I do like the subculture idea, and it was something I was thinking about as I wrote it! I think that's 100% a viable path too.)
On a more speculative level, the people who I see drifting away strongly tend to be the people who have support networks outside of EA, rather than the people who are more reliant on EA for their social/emotional needs. I'm sure some of this trend exists for every movement, but I somewhat believe it's larger for EA. This post gets at one of the reasons I've personally hypothesized for this: that EA feels cold to a lot of people in a way that's difficult to describe, and humans are emotions-driven at their core. Regardless of the reason, selecting for members who are socially and emotionally reliant on the movement seems like a recipe for disaster.