On a practical level, I don't necessarily disagree with anything you're saying in the first two paragraphs. I tried to address some of what you're saying in my conclusion, and I don't think anything in the "main" argument (benefits missions provide but EA is currently missing) is incompatible with having the abstract "doing good" as the core EA thing (so then it just becomes a semantic question of how we define missions).
As for your last paragraph, I argue for a cultural shift because I've personally seen a lot of people who resonate very much with EA intellectually but not emotionally (like here). This is fine when they have an easy transition into a high-impact role and there is less abstract stuff they can feel emotional about, but a lot of people don't "survive" that transition. They are aligned on principles, but EA is a really different community and movement that takes time to get used to. The current EA community seems to select not only for people who share values, but also for people who share personality traits. I think that's bad.
(I do like the subculture idea, and it was something I was thinking about as I wrote this! I think that's 100% a viable path too.)
On a more speculative level, the people I see drifting away tend strongly to be the people who have support networks outside of EA, rather than the people who are more reliant on EA for their social/emotional needs. I'm sure some of this trend exists for every movement, but I somewhat believe it is larger for EA. This post gets at one of the reasons I have personally hypothesized to be the cause: that EA feels cold to a lot of people in a way that is difficult to describe, and humans are emotion-driven at their core. Regardless of the reason, selecting for members who are socially and emotionally reliant on the movement seems like a recipe for disaster.