Thanks for sharing your misgivings.
I think it may be illuminating to conceptualise EA as having several "attractor failure modes" that it can coalesce into if insufficient attention is paid to keeping EA community spaces from sliding into them. You've noted some of these attractor failures in your post, and they are often related to other communities and ideas that overlap with EA. They include (but are not limited to):
The cultic self-help conspiratorial milieu (probably from rationalism)
Racism and eugenicist ideas
Doom spirals (many versions depending on cause area, but "AI will kill us all, P(doom) = 95%" is definitely one of them)
The question, then, is: how do you balance community moderation so that it promotes the environment of individual truth-seeking necessary to support EA as a philosophical project, while also steering clear of these attractors, given a documented history within EA of them leading to outcomes that don't work out so well? I wonder what CEA's community health team have said on the matter.
I'm very glad of Reflective Altruism's work and I'm sorry to see the downvotes on this post. Would you consider reposting this as a main post with the emotive language dialed down, in order to better reach people? I'd be happy to give you feedback on a draft.
Thanks. I'll think about the idea of doing a post, but, honestly, what I wrote was what I wanted to write. I don't see the emotion or the intensity of the writing as a failure or an indulgence, but as me saying what I really mean, and saying what needs to be said. What good's sugar-coating it?
Something that anyone can do (David Thorstad has given permission in comments I've seen) is simply repost the Reflective Altruism posts about LessWrong and about the EA Forum here, on the EA Forum. Those posts are extremely dry, extremely factual, and not particularly opinionated. They're more investigative than argumentative.
I have thought about what, practically, to do about these problems in EA, but I don't think I have particularly clear or well-formed thoughts on that. An option that would feel deeply regrettable and unfortunate to me would be for the subset of the EA movement that shares my discomfort to try to distinguish itself under some label such as "effective giving". (Someone could probably come up with a better label if they thought about it for a while.)
I hope that there is a way for people like me to save what we love about this movement. I would be curious to hear ideas about this from people who feel similarly.