How To Prevent EA From Ever Turning Into a Cult
Let me start off by saying: I don’t think EA qualifies as a cult. There are several defining characteristics of cults that simply don’t apply to the EA movement. However, like most communities, EA has some cult-like attributes. In this post I’ll go through them and suggest some changes that might mitigate the risk of the movement ever harming its members.
This post is based on my personal experience as a highly engaged EA over the past couple of years. Since I don’t want my name to be the first thing people associate with “EA as a cult”, I have chosen to publish this anonymously.
This is an important topic for several reasons:
So that people who just take a quick glance at the movement won’t lose interest (”Sounds like a cult to me”).
To prevent people from having a bad experience within the community today (e.g. feeling pressured or taken advantage of).
To prevent the movement, or parts of it, from ever turning into a cult in the future (or otherwise abusing the influence it has on its members).
While putting together this post, I found that though there seems to be a popular consensus on what one is referring to when using the word “cult”, there is no established academic definition or checklist. [1] The same goes for the more neutral term ”highly demanding group”. If I were to define it, I’d say something like ”a closed group that uses a higher cause and social pressure to motivate its members to forsake their own needs in order to satisfy the personal needs of members higher up the group’s hierarchy” (for example the Church of Scientology, Jonestown, NXIVM). That is how I’ll be using the word throughout this post.
The exact definition doesn’t really matter – we don’t just want to ”pass the cult checklist” but to build a considerate and healthy community.
Why EA is not a cult
Main reasons why I don’t think the EA movement qualifies as a cult:
You’re free to come and go. The main characteristic of a cult is that there is a barrier between the group and the outside world. You join at a cost – and the cost is even higher to leave. When it comes to EA, there is no threshold for being able to call yourself an effective altruist (no initiation ritual, oath, or fee to pay). Neither is there a punishment for leaving the movement (harassment, financial debt, social exclusion). You can come and go as you please.
Free flow of ideas – including criticism. You don’t have to take expensive courses or volunteer at the leader’s commune in order to ”gain enlightenment”. The ideas of effective altruism are published online, freely available for everyone to consume and discuss. There is no formal hierarchy determining whose interpretation is orthodox. People are encouraged to express disagreement, as well as to consider arguments from people outside of the movement.
No formal leader or hierarchy. Though there are some thinkers within the community with a more widespread influence than others, there are no formal authorities. No leaders to obey, tiers of enlightenment, or mandatory career ladders to climb.
There are, however, cult-like aspects of the movement as I see it. I’ll present them here, together with actions that I think will address the problem and put safeguarding mechanisms in place for the EA community to grow in a healthy direction.
Centralization
Cults are usually run by a close-knit group, an inner circle consisting of the leader and a selected few. This increases the risk of corruption, biases, and abuse of power.
EA doesn’t have this formal hierarchy, but rather a network of individuals and organizations, including the Centre for Effective Altruism, 80,000 Hours and Open Philanthropy. This network, however, is geographically centralized to the EA hot spots of Oxford/London and the Bay Area. In practice, this means that a limited number of people have a disproportionately large influence on this global, growing movement.
As in most communities, these individuals not only work with and assign grants to each other, but also party and have intimate relationships. Measures need to be in place to prevent these human tendencies from turning into scandals.
Decentralization of the EA movement – encouraging geographical diffusion of the movement’s important organizations and institutions.
Professionalization of the movement. Discourage personal/intimate relationships with people one works with. Relocate individuals at risk of conflicts of interest, for example after they have started dating a colleague.
Increasing the number of EA spokespersons in external communication to build brand resilience in case of a scandal.
Local groups seeking external funding, preferably from several sources other than the Centre for Effective Altruism and EA Infrastructure Funds, so that they can make independent decisions.
More engagement and cooperation with relevant non-EA people and organizations.
Elitism and uniformity
There is a notion within the movement that since the ideas are complex and hard to grasp, they are preferably taught to a select few (young) intellectuals who will later go on to infiltrate important societal institutions. Kind of like the Illuminati or the Masons. There are good arguments to be made for this strategy. Some people seem to have the potential to do more good than others, and it’s wise to attract and invest in these individuals when community-building resources are limited.
However, this “ivory tower” strategy has major downsides. When we lack systematic methods to identify these individuals, we will grow the community in a biased direction (in EA referred to as the founder effect).
We miss out on relevant criticism, feedback, and revolutionary new ideas from people and groups we wouldn’t consider asking for advice (e.g. most marginalized groups). And we miss out on lots of potentially high-impact people who are intelligent, kind and driven, but who wouldn’t want to be part of an intellectual, elitist club.
To use a personal example, I have an acquaintance who has worked in public affairs for a long time, who has shown an interest in the EA idea of “doing the most good”, and who would likely be a tremendous asset to the movement. But I’m reluctant to bring them to an actual event because I know most people there will be a) significantly younger and b) discussing the suffering of insects (or some other niche, nerdy subject).
What to do about it:
Strive for diversity in external and internal communication. Invite speakers and highlight EAs of different social backgrounds, ages, races, and educational levels.
Avoid jargon and abbreviations in external and internal communication (lectures, forum posts, podcasts) to lower the threshold for people to engage in discussion.
Diversify community activities to allow for members with different lifestyles. For people who have a family, breakfast seminars might be preferred over evening meetups.
Appear in mainstream media. The main ideas of EA are easy enough for a good communicator to explain. By appearing regularly in (high-fidelity) mainstream media, the movement gains transparency and legitimacy.
More ideas in this post.
Donations
Some cult-like organizations are thinly veiled pyramid schemes designed to milk money from their members. Naturally, some people react with scepticism when they learn that the EA movement has a tradition of charitable donations, where some members have even taken a pledge to donate a substantial part of their lifetime income (often 10 percent) to EA causes.
The important distinction here is, of course, that these generous donations don’t end up in the back pocket of senior EAs. Instead, they are thoughtfully invested to make the world a better place. Mechanisms need to be in place to make sure that’s how it stays.
Frugality needs to remain a virtue in EA culture so as not to attract members who are motivated primarily by money. Not that we should be offering sleeping bags and tents at our retreats – the ”EA standard” needs to be comfortable enough to be considered by individuals from the general public. But grants should not be used to pay for excesses for a select few (e.g. fancy hotels, resorts, or domestic work).
Transparency. Grants to individuals/projects within the EA movement need to be even more transparent and clearly justified than grants to independent organizations. Open declarations of how the money is spent.
Encourage ordinary EA donors to give to independent organizations/charities rather than to the movement itself (not counting fees that cover one’s own conference attendance etc.). It’s better if EA orgs are funded by philanthropists with fortunes to spare.
Volunteering and work
Volunteering is a common occurrence in most social movements. However, people are easier to overwork and take advantage of when they believe they are working for a greater good. Some cults use junior members to run personal errands for senior members or to work under slave-like conditions. The EA movement needs to be mindful of these tensions when engaging volunteers.
One dimension of this is the suggested career path to become an executive assistant for an impactful person. Even though the assistants I know within the community are being treated better than their counterparts in business, this role could potentially be misused. Especially if the assistant is working for someone they look up to or whose time/wellbeing they consider to be of more importance than their own.
Employees and executive assistants within the community should be offered competitive salaries and encouraged to join a workers’ union.
Keep an eye on the community’s volunteers and make sure no one is being overworked. Remember that community building is a marathon, not a sprint.
Support community builders. Community builders, who are often young and rather inexperienced, should receive support to make sure staff and volunteers have a healthy work environment.
Belonging and sense of community
It’s great when people find friends, partners, and new colleagues within a community. At the same time, it means the social cost of criticising or leaving that community gets higher.
Encourage people to have relationships outside of the EA community, both personal and professional.
Ask them for advice. If you ask people within the EA community for feedback on ideas and decisions, ask friends, family and professional contacts outside of the community as well. They might have a different perspective.
If someone, someday, says they’re worried about the role ”this EA thing” has taken in your life – listen and take their concern seriously. Is there a change to be made?
Sex
Where there are people, there is sex. And just as influence can be used to gain money, it can also be used for sexual exploitation. We must make sure people are safe and that consent is key within the EA community.
Apart from sexual abuse, there are more subtle aspects to consider. Some were brought up in this recent post on power dynamics. Seniors might use their status to sleep with less experienced members who, once the star-struckness has worn off, feel naive and used. People might feel pressured to follow community norms they are not comfortable with, for example joining a cuddle puddle because everyone else is. In some cults, matchmaking is part of community building: members trust their leaders enough to let them recommend whom they ought to marry or sleep with.
Apart from being painful for the victims, sexual exploitation in cult-like settings makes for great media scandals. Especially when it comes to norm-breaking sexual practices like polyamory and BDSM.
Senior community members should avoid having intimate relationships with junior members. If it does happen, informed consent (including setting clear expectations beforehand) is more important than ever. Never offer professional opportunities on the basis of having (or having had) an intimate relationship.
Always ask for consent. Rather one time too many than one time too few. Do this in all situations that involve intimacy or nudity (”Is it OK if I share this blanket?”).
If you organize community events like retreats, think through how you can respect people’s different personal boundaries. Err on the side of being too conservative rather than too frivolous (there are other communities for that). For example, if there’s sauna bathing or swimming, make bathing suits the norm.
When the community becomes large enough to have some sort of EA dating platform, it shouldn’t be centrally run, and the data must be securely stored.
I love being part of this community. With these measures in place, I think we’ll be able to grow even more, staying kind and considerate along the way.
[1] Though some have tried: the Group Psychological Abuse Scale.
There are many considerations of relevance for these choices besides the risk of becoming or appearing like a cult. My sense is that this post may overestimate the importance of that risk relative to those other considerations.
I also think that in some cases, you could well argue that the sign is the opposite to that suggested here. E.g. frugality could rather be seen as evidence of cultishness.
Could you elaborate more on (some of) these considerations and why you think the cultishness risk is being overestimated relative to them?
My intuition is that it’s being generally underestimated, as at least two cults have already sprung from EA-adjacent circles (one, two). While I don’t think the ideas behind them are currently prevalent in EA, I do think the intellectual environments that brought them forth are, to some meaningful extent.
I can’t speak for OP, but it looks to me more like a poor choice of words, as OP explicitly wrote:
(I’m not sure what’s meant by that last one). OP also suggested paying competitive salaries for EA jobs (which is what’s already happening, at least in the roles relevant to me).
This is not to say I think all of these ideas are necessarily good. On the contrary, because I think this consideration is important, I’d value dialogue on how to succeed in preventing it, and I don’t expect the first few ideas from anyone to all be right.
I didn’t say the cultishness risk is generally overestimated. I said that this particular post overestimates that risk relative to other considerations, which are given little attention. I don’t think it’s right to suggest a long list of changes based on one consideration alone, while mostly neglecting other considerations. That is especially so since the cult notion is anyway kind of vague.
Do you think it would be better to not suggest any action, or to filter these suggestions without any input from other people? To me it reads like “here are some ideas for prevention” rather than “we must do all of these immediately”. Though at least some of them look obviously true, like encouraging having non-EA friendships and discouraging intimate relationships between senior and junior employees of EA orgs.
I’m not sure what you mean. I’m saying the post should have been more carefully argued.
I assume you mean the same org here (I think that’s the natural reading). But the post rather says:
That’s a very different suggestion.
Thanks to Anonymous for a provocative and important post.
From my perspective as a middle-aged evolutionary psychologist who studies sexual selection, EA certainly can give off some cultish vibes, insofar as cults tend to have some key features centered around controlling the mating effort and money of young adults.
Considered as a cultural way to hack human mating psychology, cults tend to involve mostly young single people, recruited into an ideology that alienates them from their extended families and the broader mating market, that channels their status-seeking and mating-effort instincts into some project ‘for the greater good of humanity’, and that expects high levels of donations to cult causes, combined with a frugal and self-effacing lifestyle.
The cult leaders discourage formation of pair bonds or polyamorous networking among the cult followers, and discourage marriage and kids, which could distract from the all-important mission. The normal human efforts to attract mates through intelligence, creativity, wit, and verbal fluency are channeled into arcane and endless debates about details of the cult ideology, rather than leading to actual mating and reproduction. The cult members end up speaking with an odd, somewhat pretentious lexicon that’s off-putting to outsiders, and that limits their ability to attract mates outside the cult.
The monetary benefits to the cult leaders are usually money-laundered and status-laundered through official organizations (church, charity, foundation, or think tank), in the form of generous salaries, perks, and grants, rather than through outright criminal theft of donations (as in the usual Ponzi schemes).
The cult promotes ageism and distrust in older, professionally successful outsiders, who could provide some skeptical counter-balance against the cult’s youthful, utopian ideology.
By these criteria, EA can sometimes look a bit cultish. But then, by the same criteria, most academic disciplines are also quite cultish. As are most start-up companies. As are most organizations that end up having high impact in the world....
Based on my personal experiences, this is at least as much of an issue in the EA and Rationality communities as it is in society as a whole.
I swear I had the idea for this kind of post yesterday too. But this is much better than what I could’ve written, especially thanks to the concrete suggestions.
Well done on writing up this post; it contains a long list of things we could do to avoid looking like a cult, but I agree with Stefan Schubert. A group that spends all of its energy seeming respectable is unlikely to have a massive impact on the world. When I was first involved in EA, I made this mistake by being reluctant to talk about AI risk, but I underestimated people’s receptiveness.
I now lean towards it being important for EA to keep some of its weirdness as a defence against those who are fully enmeshed in the social reality.
As I said before, this isn’t a bad essay, but it would be stronger if it covered fewer issues and engaged more with the reasons against adopting the proposed policies.
This is an older post now, so I have no idea if anyone will see this, but it seems to me that you almost need “pockets” of cultishness in the broader EA movement. This follows up on Geoffrey Miller’s final sentence in his comment, about how a lot of impactful movements do seem a bit cultish. Peter Thiel writes really well about why some start-ups seem cultish (and why they should be) in Zero to One, and I think I agree with him: it does seem to me that a sense of unity/mission-alignment and we’re-better-than-everyone-else can produce extraordinary results. Sometimes this is extraordinarily bad (like Adam Neumann and WeWork) and sometimes it’s extraordinarily good (like Steve Jobs and Apple, Jack Dorsey and Twitter, Bill Gates and Microsoft, etc.), where certain people can motivate others to tirelessly perform extremely high-value work.
Obviously, cultishness has major downsides. One is external: for example, proponents of the environmental movement were frequently dismissed as hippies before environmentalism went mainstream, and I wouldn’t want the same to happen to any EA. The second is internal: as you’ve noted, problems like sexual harassment and abuse come up, which is obviously extremely traumatising for the victims involved.
I’d say that one thing I’m saddened by is the relative lack of public awareness of EA or the big EA causes (x-risk, global health, animal welfare, etc), and in a way, it may require us to become more cultish to solve that problem. There’s a kind of optimal stopping problem at play here in that once EA becomes more cultish, it’s hard to make it less cultish (at least in the eyes of the non-EA public) but if EA is too non-cultish, I fear that we won’t appropriately be able to spread the word. I’m also afraid that a lot of our community-building efforts aren’t very high-leverage, and seem like they often fizzle out, particularly at universities. It’s great that we have such a huge collection of smart people working on important stuff, but we might need a few cult leader personalities (or, to use Ayn Rand’s words, “prime movers”) to really move the needle.
One thing I’ve been thinking about—which perhaps flies in the face of what I’ve just said about spreading awareness—is the need to prevent reputational tail risk for the EA movement altogether. For example, can we spread awareness of key issues without mentioning EA, and can we get more people to commit their careers to doing good without mentioning EA? In some sense, using the blanket term “EA” is a blessing and a curse: a blessing in that it’s a very versatile calling card to put in social media bios, introductory blurbs, etc (e.g. ”...I’m really into effective giving...”) but also that it creates huge collateral damage for hit-piece journalists in case anything seriously bad does happen (like when environmentalists would get called dope-smoking hippies).
Like everything, there’s a crucial balancing act here: how can we be rational, but also be highly motivated and aligned? Curious to know people’s thoughts on this one, because I still feel like EA (as a community) is in the early days, and could become so much more.
I agree that all-inclusive resorts are unnecessary and excessive in almost all situations. Not sure what “domestic work” means. However, I worry that a heavy emphasis on frugality could promote feelings of scarcity in some EAs and thereby cause suboptimal decisions. It’s hard for people to do high quality work when they worry obsessively about money.