A hypothesis for why some people mistake EA for a cult

Last week, someone commented on the Effective Altruism USC Instagram asking whether we’re a cult (we deleted the comment).

This question isn’t out of the ordinary for us at USC, and I’m sure other EA groups get asked the same thing.

In our opinion, our Instagram account might not be the most polished or sleek, but it certainly doesn’t give off any cult-like vibes. Other elements of EA might understandably be seen as cult-like (e.g. ubiquitous “retreats,” mysterious wealthy donors funding our dinners, debate about which year AI will kill everyone), but not our Instagram, nor any other EA uni group’s Instagram that I’ve seen.

The comment we received was on a post announcing an AI governance speaker event, and (to my knowledge) the commenter has not yet interacted with anyone on our leadership team, so it’s possible that this cult impression came solely from seeing our (relatively normal) online presence.

So what gives?

Here’s a hypothesis that could explain some cases of mistaken impressions (although obviously not all): when people look up “altruism,” they find potentially culty-looking things.

There’s already been plenty of discussion of how the word “altruism” itself reminds people of other dogmatic “isms” (Positive Impact Society Erasmus wrote a post mentioning it, with a recent update). In my experience, a lot of people don’t know what “altruism” means, so they may go look it up on Google. And what do they find?

Even though the description Google surfaces is factually correct, it prominently mentions spirituality, tradition, and religion, which could plausibly seem cult-like for a movement that advertises itself as “doing good using evidence and reasoning”. People looking for a quick definition might skim those words and assume EA is some strange system of worship. The 18th-century oil painting probably doesn’t help.

The second Google result I get is this:

For someone who’s just been primed with the first definition, this might only raise more red flags. These people are all about the “moral practice” of a “traditional virtue,” and they want us to “promote others’ welfare” at our own cost?

I’m aware that this is an uncharitable reading of the Google search results. But I’m trying to trace out the potential paths that could take someone from merely seeing the words “effective altruism”, without interacting with us or knowing much about the movement, to concluding we’re a cult. If anyone has another plausible story for how this might happen, I’d love to hear it.

Some things we can do about it

  1. Give feedback on Google’s knowledge panel (the rectangular box on the right when you look something up), so that it says more about helping others and less about religion/spirituality. I’m not sure what process Google uses to review feedback, so I don’t know whether they’d be more likely to consider changes if multiple people submit the same comment, but it seems worth trying anyway. To give feedback, click the “Feedback” link at the bottom of the knowledge panel.

  2. Edit the Wikipedia page for altruism again. There’s already content about EA there, but the lead section of the article could be tweaked. It could also help to replace the main image with something more contemporary. (I haven’t done this yet, but I might try to.)

  3. Explicitly define “altruism” when giving EA intro pitches, whether verbally or in writing. Most EA intro articles that I’m aware of (like this one) don’t do this, and people who aren’t familiar with the word might then Google it, leading to the cult impression. Going forward, it’s probably low-cost to add a quick sentence defining altruism to any intro EA content, to head off potential misinterpretations.

  4. Ask people outside the EA community why they associate us with cultiness, to see whether this hypothesis even has merit. I’m not sure whether any focus-group-style studies of the EA community’s image have been done before, but maybe that’s worthwhile. (And if anyone knows of any, I’d be curious to read them.)