My theory is that while EA/rationalism is not a cult, it contains enough ingredients of a cult that it’s relatively easy for someone to go off and make their own.
Not everyone follows every ingredient, and many of the ingredients are actually correct/good, but here are some examples:
Devoting one’s life to a higher purpose (saving the world)
High-cost signalling of group membership (donating large amounts of income)
The use of in-group shibboleths (like “in-group” and “shibboleths”)
The use of weird rituals and the breaking of social norms (Bayesian updating, “radical honesty”, etc.)
A tendency to isolate oneself from non-group members (group houses, EA orgs)
The belief that the world is crazy, but we have found the truth (rationalist thinking)
The following of sacred texts explaining the truth of everything (the Sequences)
And even the belief in an imminent apocalypse (AI doom)
These ingredients do not make EA/rationalism in general a cult, because it lacks enforced conformity and control by a leader. Plenty of people, including myself, have posted on LessWrong critiquing the Sequences and Yudkowsky and been massively upvoted for it. It’s decentralised across the internet, and if someone wants to leave, there’s nothing stopping them.
However, what seems to have happened is that multiple people have taken these base ingredients and just added in the conformity and charismatic-leader parts. You put these ingredients in a small company or a group house, put an unethical or mentally unwell leader in charge, and you have everything you need for an abusive cult environment. Now it’s far more difficult to leave, because your housing/income is on the line, and the leader can use the already-established breaking of social norms as an excuse to push past boundaries and consent in the name of the greater good. This seems to have happened multiple times already.
I don’t know what to do if this theory is correct, besides applying extra scrutiny to leaders of sub-groups within EA, and maybe easing up on unnecessary rituals and jargon.
For what it’s worth, I view the central aim of this post as arguing against the “cult” model, which I find quite unhelpful.
In my experience the cult model tends to have relatively few mechanistic parts, and mostly seems to put people into some kind of ingroup/outgroup mode of thinking, where people make a list of traits and then, somehow magically, as something has more of those traits it gets “worse and scarier” along some generic dimension.
Like, FTX just wasn’t that much of a cult. Early CEA just wasn’t that much of a cult by this definition. I still think they caused enormous harm.
I think breaking things down into “conformity + insecurity + novelty” is much more helpful in understanding what is actually going on.
For example, I really don’t think “rituals and jargon” have much to do with what drives people to do more crazy things. Basically all religions have rituals and tons of jargon! So does academia. Those things are not remotely reliable indicators, nor are they even correlated at all, as far as I can tell.
Could you elaborate on how early CEA caused enormous harm? I’m interested to hear your thoughts.
I would like to give this a proper treatment, though that would really take a long time. I think I’ve written about this some in the past in my comments, and I will try to dig them up in the next few days (though if someone else remembers which comments those were, I would appreciate someone else linking them).
(Context for other readers, I worked at CEA in 2015 and 2016, running both EAG 2015 and 2016)
Did you have a chance to look at your old comments, btw?
I searched for like 5-10 minutes but didn’t find them. Haven’t gotten around to searching again.
Unfortunately, I think there is a correlation between traits and rituals that are seen as “weird” and the popping up of this bad behaviour.
For example, it seems like far more of these incidents are occurring in the rationalist community than in the EA community, despite the overlap between the two. It also seems like way more is happening in the Bay Area than in other places.
I don’t think your model sufficiently explains why this would be the case, whereas my theory posits that the Bay Area rationalists have a much greater density of “cult ingredients” than, say, the EA meetup at some university in Belgium or wherever. There, people have lives and friends outside of EA, they don’t fully buy into the worldview, etc.
I’m not trying to throw the word “cult” out as a pejorative conversation ender. I don’t think Leverage was a full-on religious cult, but it had enough in common with one that it ended up with the same harmful effects on people. I think the rationalist worldview leads to an increased ease of forming these sorts of harmful groups, which does not mean the worldview is inherently wrong or bad, just that you need to be careful.
One explanation could simply be that social pressure in non-rationalist EA mostly pushes people towards “boring” practices rather than “extreme” practices.
I think this premise (that far more of these incidents occur in the rationalist community than in the EA community) is false. Both early CEA and FTX indeed had quite little to do with the rationality community, and early Leverage was also more embedded in EA land than rationality land (though they were in the Bay Area, which is definitely relevant).
Agree that the Bay Area in general feels kind of destabilizing in a relevant way. Though, again, FTX, which is by far the most important datapoint (I care about preventing FTX at least 10x more than I care about all the other ones), was not located in the Bay Area and indeed had relatively few Bay Area connections.
I mostly agree with your larger point here, especially about the relative importance of FTX, but early Leverage was far more rationalist than it was EA. As of 2013, Leverage staff was >50% Sequences-quoting rationalists, including multiple ex-SIAI and one ex-MetaMed, compared with exactly one person (Mark, who cofounded THINK) who was arguably more of an EA than a rationalist. Leverage taught at CFAR workshops before they held the first EA Summit. Circa 2013, Leverage donors had strong overlap with SIAI/MIRI donors but not with CEA donors. Etc.
What happened at early CEA?
I think cults are a useful lens to understand other social institutions, including EA.
I don’t like most “is a cult”/”is not a cult” binaries. There’s often a wide gradient.
To me, lots of academia scores highly on several cult-like markers. Political parties can score very highly. (Some of the MAGA/Proud Boys stuff comes to mind.)
The inner circle of FTX seemed to have some cult-like behaviors, though to me that’s also true of a number of startups and other intense situations.
I did like this test, where you can grade groups on a scale to see how “cult-like” they are.
http://www.neopagan.net/ABCDEF.html
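As a toy illustration of the “gradient rather than binary” framing, here is a minimal sketch of that kind of scoring. The trait names and the plain averaging below are made up for illustration and are not the criteria the linked test actually uses:

```python
# Toy sketch: rate a group 1-10 on a handful of cult-like traits and
# look at the overall score as a gradient, not a cult/not-cult binary.
# The traits and the simple averaging here are illustrative only.
from statistics import mean

def cultishness_score(ratings: dict[str, int]) -> float:
    """Average a set of 1-10 trait ratings into a single score."""
    for trait, rating in ratings.items():
        if not 1 <= rating <= 10:
            raise ValueError(f"rating for {trait!r} must be between 1 and 10")
    return mean(ratings.values())

# Hypothetical ratings for some group; higher = more cult-like on that trait.
example = {
    "charismatic leader": 3,
    "enforced conformity": 2,
    "isolation from outsiders": 5,
    "cost of leaving": 4,
}
print(f"overall: {cultishness_score(example):.1f} / 10")
```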
All that said, I do of course get pretty annoyed when most people criticize EA as being a cult; I often get the sense that they treat it as a binary, and are also implying something that I don’t believe.