A key characteristic of a cult is a single leader who accrues a large amount of trust and is held by themselves and others to be singularly insightful. The LW space gets like that sometimes, less so EA, but they are adjacent communities.
Recently, Eliezer wrote:

> The ability to do new basic work noticing and fixing those flaws is the same ability as the ability to write this document before I published it, which nobody apparently did, despite my having had other things to do than write this up for the last five years or so. Some of that silence may, possibly, optimistically, be due to nobody else in this field having the ability to write things comprehensibly—such that somebody out there had the knowledge to write all of this themselves, if they could only have written it up, but they couldn’t write, so didn’t try. I’m not particularly hopeful of this turning out to be true in real life, but I suppose it’s one possible place for a “positive model violation” (miracle). The fact that, twenty-one years into my entering this death game, seven years into other EAs noticing the death game, and two years into even normies starting to notice the death game, it is still Eliezer Yudkowsky writing up this list, says that humanity still has only one gamepiece that can do that. I knew I did not actually have the physical stamina to be a star researcher, I tried really really hard to replace myself before my health deteriorated further, and yet here I am writing this. That’s not what surviving worlds look like.
I don’t necessarily disagree with this analysis; in fact, I have made similar observations myself. But the social dynamic of it all pattern-matches to that of a cult, and I think that’s a warning sign we should be wary of as we move forward. I think we should probably have an ongoing community health initiative targeted specifically at monitoring for signs of groupthink and other forms of epistemic failure in the movement.