EA may look like a cult (and it’s not just optics)

Is EA a cult?

Cultishness is a spectrum. As I'll demonstrate, EA has some characteristics normally associated with cults, which can give both outsiders and those already engaged with EA a very bad impression. I'm not arguing, however, that the situation is so bad that EA is a cult in the extreme sense of the word. But people who don't know much about EA and come away with the impression that it's a cult are not crazy to think so.

A narrative of EA as a cult

What if I told you about a group of people calling themselves Effective Altruists (EAs) who wish to spread their movement to every corner of the world? Members of this movement spend a lot of money and energy on converting more people to their ideology (they call it community building). They tend to target young people, especially university students (and increasingly high school students). They give away free books written by their gurus and run fellowship programs. And they will gladly cover all expenses for newcomers to travel to and attend their events, where they can expose them to their ideas.

EAs are encouraged to consult the doctrine and other movement members about most major life decisions, including what to do for a living (career consultation), how to spend their money (donation advice), and what to do with their spare time (volunteering). After joining the movement, EAs are encouraged to give away 10% or more of their income to support the movement and its projects. It’s not uncommon for EAs to socialize and live mostly with other EAs (some will get subsidized EA housing). Some EAs want even their romantic partners to be members of the community.

While they tend to dismiss what’s considered common sense by normal mainstream society, EAs will easily embrace very weird-sounding ideas once endorsed by the movement and its leaders.

Many EAs believe that the world as we know it may soon come to an end and that humanity is under existential threat. They believe that most normal people are totally blind to the danger. EAs, on the other hand, have a special role in preventing the apocalypse, and only through incredible efforts can the world be saved. Many of these EAs describe their aspirations in utopian terms, declaring that an ideal world free of aging and death is waiting for us if we take the right actions. To save the world and navigate the future of humanity, EAs often talk about the need to influence governments and public opinion to match their beliefs.

It’s not just optics

While I've focused on how EA might be perceived from the outside, I think that many of the cult-like features of EA pose a real issue, so it's not just a PR problem. I don't yet have a good mental model of all the ways in which this plays out, but I believe there are other negative consequences. The impression of a cult could also explain why some of the recent media attention on EA hasn't been very positive (I'm not saying it's the only reason).

How to make EA less cultish

Many of the features that make EA look and sound like a cult (e.g. the pursuit of growth, the willingness to accept unconventional ideas) are quite essential to the project of doing the most good, so I'm not suggesting we automatically get rid of everything that could possibly be perceived as cultish. On the other hand, how EA is perceived is quite important, so we shouldn't ignore these considerations either. Having said that, I believe there are cultural norms we could embrace that would push us away from cultishness without significantly compromising other goals.

Some helpful anti-cultishness norms that are already established in EA to some extent, and that I'd like to see further cultivated, include:

Other norms that I’d like to see include:

  • Advertising EA too aggressively can be counterproductive (even though we want EA to grow).

  • Having one's whole life revolve around EA should be considered unhealthy.

  • Being known as the person who only talks about EA might be a red flag.

  • Going against social norms is not a virtue; it’s a price we sometimes have to pay for doing the right thing.

  • Moderation is a virtue.

  • Mixing work interactions and intimate relationships (such as sex or residence) shouldn’t be taken lightly.

  • Conflicts of interests should be taken seriously.

  • It's important to seriously consider what non-EA people and organizations have to say, even when they don't think and communicate in EA's preferred style (and when it might be tempting to dismiss them as grounded in bad epistemics).

Others have also written about different aspects of the problem and potential solutions (see for example 1, 2, 3, 4).

Summary

I don't think that EA is a cult in the worst sense of the word, but it seems to have many cult-like features that easily give a bad impression (mainly, but not only, to outsiders). There are cultural norms and attitudes that, if cultivated, could make EA less cultish.

Acknowledgements

I want to thank Edo Arad, Gidon Kadosh and Sella Nevo for their feedback. Insofar as this post still sucks, it’s entirely my fault.