I also notice that when more experienced EAs talk to new EAs about x-risk from misaligned AI, they tend to present an overly narrow perspective. Sentences like “Some superintelligent AGI is going to grab all the power and then we can do nothing to stop it” are thrown around casually without stopping to examine the underlying assumptions. Then newer EAs repeat these cached phrases without having carefully formed an inside view, and the movement has worse overall epistemics.
Here is a recent example of an EA group having a closed-off social circle, to the point where a person who actively embraces EA has difficulty fitting in.
Haven’t read the whole post yet, but the start of Zvi’s post here lists 21 EA principles that are not commonly questioned.
I should clarify: I think EAs engaging in these cult-indoctrination patterns are doing so unintentionally, not intentionally.
One specific example would be in my comment here.
I am not going to name the specific communities where I’ve observed culty behavior because this account is pseudonymous.