Think about EA alignment like skill mastery, not cult indoctrination

People doing meta-EA sometimes jokingly frame their work as “I want to indoctrinate people into EA, sort of like what cults do, but don’t worry! Haha! What we do is fine because EA isn’t a cult.”

I think this is a harmful mindset. When thinking about how to get more people to be EA-aligned, skill mastery is a better model than cult indoctrination.

Skill mastery often has these attributes:

  • Repeated deliberate practice.

  • “Thinking about it in the shower”, i.e. thinking about it without much effort.

  • “Gears-level understanding”, i.e. knowing the foundations of the skill and understanding how all the pieces relate.

Cult indoctrination, by contrast, is about gaslighting people into believing that there can be no other truth than X. Cults do this by repeating talking points and creating closed social circles.

Accordingly, when thinking about how to get more people to be EA-aligned, here are some good questions to ask:

  • Can we build structures which enable repeated deliberate practice of EA? Good examples are intro fellowships, book recommendations, and club meetings. Are there more?

  • Can we get people to “think about EA in the shower”? One way to encourage this could be to provide better-written reading materials that pose questions amenable to shower thoughts.

  • Can we encourage more “gears-level understanding” of EA concepts? For example, emphasize the reasons behind x-risk calculations rather than their conclusions.

It is also probably a bad idea for EA to resemble a cult, because cults have bad epistemics. Accordingly, here are some paths to avoid going down:

  • Repeating talking points: when discussing EA topics with a skeptical non-EA, don’t repeat standard EA talking points if they’re not resonating. It is useless to say “AI Safety is a pressing problem because superintelligent AGI may pose existential risk” to someone who does not believe superintelligent AGI could ever exist. Instead, you can have a more intellectually honest conversation by first understanding their current worldview and model of AI, and building from there. In other words, it is important to adopt good pedagogy: building from the student’s foundations, rather than instructing them to memorize isolated facts.

  • Closed social circles: for example, in the setting of a university group, it is probably a bad idea to create an atmosphere where people new to EA feel out of place.

The central idea here is that promoting gears-level understanding of EA concepts is important. Gears-level understanding often has repeated deliberate practice and shower thoughts as prerequisites, so skill mastery and gears-level understanding are closely related goals.

I would rather live in a world of people who have their own sound models of x-risk and other pressing problems, even if those models substantially differ from the standard EA viewpoint, than a world of people who are fully on board with the standard EA viewpoints but lack mastery of the ideas behind them.

Summary: People who try to get more people to be EA-aligned often use techniques associated with cult indoctrination, such as repeating talking points and creating closed social circles. Instead, I think it is more useful to think about EA-alignment as a skill that a person can master. Accordingly, good techniques to employ are repeated deliberate practice, “thinking about it in the shower”, and promoting gears-level understanding.