Thanks! Seems like a useful perspective. I’ll pick on the one bit I found unintuitive:
Summary: People who try to get more people to be EA-aligned often use techniques associated with cult indoctrination, such as repeating talking points and creating closed social circles.
In the spirit of not repeating talking points, could you back up this claim, if you meant it literally? This would be big if true, so I want to flag that:
You state this in the summary, but as far as I can see you don’t state/defend it anywhere else in the post. So people just reading the summary might overestimate the extent to which the post argues for this claim.
I’ve seen lots of relevant community building, and I more often see the opposite: people being such nerds that they can’t help themselves from descending into friendly debate, people being sufficiently self-aware that they know their unintuitive/unconventional views won’t convince people if they’re not argued for, and people pouring many hours into running programs and events (e.g. dinners, intro fellowships, and intro-level social events) aimed at creating an open social environment.
When I say “repeating talking points”, I am thinking of:
(1) Using cached phrases and not explaining where they come from.
(2) Conversations which go like:
EA: We need to think about expanding our moral circle, because animals may be morally relevant.
Non-EA: I don’t think animals are morally relevant though.
EA: OK, but if animals are morally relevant, then quadrillions of lives are at stake.
(2) is kind of a caricature as written, but I have witnessed conversations like these in EA spaces.
My evidence for this claim comes from my personal experience watching EAs talk to non-EAs and listening to non-EAs talk about their perception of EA. The total number of data points in this pool is ~20. I would say that I don’t have exceptionally many EA contacts compared to most EAs, but I do make a particular effort to seek out social spaces where non-EAs are looking to learn about EA. Thinking back on these experiences, and on which conversations went well and which didn’t, is what inspired me to write this short post.
Ultimately, my anecdotal data can’t support any statistical statements about the EA community at large. The purpose of this post is more to describe two mental models of EA alignment and to advocate for the “skill mastery” perspective.
I think both (1) and (2) are sufficiently mild/non-nefarious versions of “repeating talking points” that they’re very different from what people might imagine when they hear “techniques associated with cult indoctrination”—different enough that the latter phrase seems misleading.
(E.g., at least to my ears, the original phrase suggests that the communication techniques you’ve seen involve intentional manipulation and are rare; in contrast, (1) and (2) sound to me like very commonplace forms of ineffective (rather than intentionally manipulative) communication.)
(As I mentioned, I’m sympathetic to the broader purpose of the post, and my comment is just picking on that one phrase; I agree with and appreciate your points that communication along the lines of (1) and (2) happen, that they can be examples of poor communication / of not building from where others are coming from, and that the “skill mastery” perspective could help with this.)
(As an aside, people might find it interesting to briefly check out YouTube videos of actual modern cult tactics for comparison.)