Hi :) I’m surprised by this post. Doing full-time community building myself, I have a really hard time imagining that any group (or sensible individual) would use these ‘cult indoctrination techniques’ as strategies to get other people interested in EA.
Was wondering if you could share anything more about specific examples / communities where you have found this happening? I’d find that helpful for knowing how to relate to this content as a community builder myself! :-)
(To be clear, I could imagine repeating talking points and closed social circles happening as side effects of other things—more specifically, of individuals often not being that good at recognizing what makes a good argument and therefore repeating something that seems salient to them, and of people naturally forming social circles with people they get along with. My point is that I find it hard to believe that any of this would be deliberate enough that this kind of criticism really applies! Which is why I’d find examples helpful—to know what we’re specifically speaking about :) )
I should clarify—I think EAs engaging in this behavior are using cult indoctrination techniques unintentionally, not intentionally.
One specific example would be in my comment here.
I also notice that when more experienced EAs talk to new EAs about x-risk from misaligned AI, they tend to present an overly narrow perspective. Sentences like “Some superintelligent AGI is going to grab all the power and then we can do nothing to stop it” are thrown around casually without stopping to examine the underlying assumptions. Then newer EAs repeat these cached phrases without having carefully formed an inside view, and the movement ends up with worse overall epistemics.
Here is a recent example of an EA group having a closed-off social circle, to the point where a person who actively embraces EA has difficulty fitting in.
Haven’t read the whole post yet, but the start of Zvi’s post here lists 21 EA principles that are not commonly questioned.
I am not going to name the specific communities where I’ve observed culty behavior because this account is pseudonymous.