One theory I’m fond of, both because it has some explanatory power and because, unlike other theories about this with explanatory power, it is useful to keep in mind and not based as directly on misconceptions, goes like this:
-A social group with a high cost of exit can afford to raise the cost of staying. That is, if it would be very bad for you to leave a group you are part of, the group can more successfully pressure you to be more conformist, work harder in service of it, and tolerate weird hierarchies.
-What distinguishes a cult, or at least one of the most important things that distinguishes it, is that it is a social group that manually raises the cost of leaving in order to also raise the cost of staying. For instance, it relocates people, makes them cut off other relationships, etc.
-Effective Altruism does not manually raise the cost of leaving for this purpose, and neither have I seen it really raise the cost of staying. Even more than in most social groups I have been part of, being critical of the movement, having ideas that run counter to central dogmas, and being heavily involved in other, competing social groups are all tolerated or even encouraged. However,
-The cost of leaving for many Effective Altruists is high, and much of this is self-inflicted. Effective Altruists like to live with other Effective Altruists, make mostly Effective Altruist close friends, enter romantic relationships with other Effective Altruists, work at Effective Altruist organizations, and believe idiosyncratic ideas mostly found within Effective Altruism. Some of this is out of a desire to do good; speaking from experience, much of it is because we are weirdos who are most comfortable hanging out with people who are similar types of weirdos to us, and who have a hard time with social interactions in general. Therefore,
-People looking in sometimes see the things from point four, the things that contribute to the high cost of leaving, and even if they can’t put what’s cultish about it into words, they worry about possible cultishness, and they don’t know the stuff in point three viscerally enough to be dissuaded of this impression. Furthermore, even if EA isn’t a cult, point four is still important, because it increases the risk of cultishness creeping up on us.
Overall, I’m not sure what to do with this. I guess be especially vigilant, and maybe work a little harder to have as much of a life as possible outside of Effective Altruism. Anyway, that’s my take.
Hey Aman, thanks for the post. It does seem a bit outdated that the top picture for altruism is a French painting from hundreds of years ago. EA should hope to change the cultural understanding of doing good from something that’s primarily religious or spiritual, to something that can be much more scientific and well-informed.
I do think some of the accusations of EA being a cult might go a bit deeper. There aren’t many other college clubs that would ask you to donate 10% of your income or determine your career plans based on their principles. One community builder who’d heard similar accusations here traced the concerns to EA’s rapid growth in popularity and a certain “all-or-nothing” attitude in membership. Here’s another person who had some great recommendations for avoiding the accusation. I particularly liked the emphasis on giving object-level arguments rather than appealing to authority figures within EA.
Overall, it seems tough for an ethical framework + social movement to avoid the accusation at times, but hopefully our outreach can be high quality enough to encourage a better perception.
Thanks for the comment! I agree with your points—there are definitely elements of EA, whether they’re core to EA or just cultural norms within the community, that bear stronger resemblances to cult characteristics.
My main point in this post was to explore why someone who hasn’t interacted with EA before (and might not be aware of most of the things you mentioned) might still get a cult impression. I didn’t mean to claim that the Google search results for “altruism” are the most common reason why people come away with a cult impression. Rather, I think that they might explain a few perplexing cases of cult impressions that occur before people become more familiar with EA. I should have made this distinction clearer, thanks for pointing it out :)
A key characteristic of a cult is a single leader who accrues a large amount of trust and is held by themselves and others to be singularly insightful. The LW space gets like that sometimes, less so EA, but they are adjacent communities.
Recently, Eliezer wrote:
The ability to do new basic work noticing and fixing those flaws is the same ability as the ability to write this document before I published it, which nobody apparently did, despite my having had other things to do than write this up for the last five years or so. Some of that silence may, possibly, optimistically, be due to nobody else in this field having the ability to write things comprehensibly—such that somebody out there had the knowledge to write all of this themselves, if they could only have written it up, but they couldn’t write, so didn’t try. I’m not particularly hopeful of this turning out to be true in real life, but I suppose it’s one possible place for a “positive model violation” (miracle). The fact that, twenty-one years into my entering this death game, seven years into other EAs noticing the death game, and two years into even normies starting to notice the death game, it is still Eliezer Yudkowsky writing up this list, says that humanity still has only one gamepiece that can do that. I knew I did not actually have the physical stamina to be a star researcher, I tried really really hard to replace myself before my health deteriorated further, and yet here I am writing this. That’s not what surviving worlds look like.
I don’t necessarily disagree with this analysis; in fact, I have made similar observations myself. But the social dynamic of it all pattern-matches to cult-like, and I think that’s a warning sign we should be wary of as we move forward. In fact, I think we should probably have an ongoing community health initiative targeted specifically at monitoring for signs of groupthink and other forms of epistemic failure in the movement.
I’ve had quite a few people ask me “What’s altruism?” when running university clubs fair stalls for EA Wellington.
Yeah, I’ve had several (non-exchange) students ask me what altruism means—my go-to answer is “selflessly helping others,” which I hope makes it clear that it describes a practice rather than a dogma.
We had that as well with EA USyd, but they were all security guards etc. working on campus, or some exchange students.
The real issue is that AI tends to be both the public-facing side of EA and the area with a lot of existential claims that sound similar to cultish claims, like “If AGI happens, we’ll go extinct.” We really need specific cause areas for new EAs, to make it less of a personal identity.