Understanding and acknowledging the subtext of fear
I think a subtext for some of the EA Forum discussions (particularly the more controversial/ideological ones) is that a) often two ideological camps form, b) many people in both camps are scared, c) ideology feeds on fear and d) people often don’t realize they’re afraid and cover it up in high-minded ideals (like “Justice” or “Truth”)[1].
If you find yourself thinking that other EAs are obviously, clearly Wrong or Evil, it’s probably helpful to
a) realize that your interlocutors (fellow EAs!) are human, and most of them are here because they want to serve the good
b) internally try to simulate their object-level arguments
c) try to understand the emotional anxieties that might have generated such arguments
d) internally check in on what fears you might have, as well as whether (from the outside, looking from 10,000 feet up) you might be acting out the predictable moves of a particular Ideology.
e) take a deep breath and a step back, and think about your intentions for communicating.
In the draft of a long-winded post I probably will never publish, I framed it thusly: “High contextualizers are scared. (They may not realize they’re scared, because from the inside, fear often looks like you’re just Right and you’re filled with righteous rage against a shadowy intolerant elite.)” Or: “High decouplers are scared. (They may not realize they’re scared, because from the inside, fear often looks like you’re just Right and you’re suffused with the cold clarity of truth against an unenlightened mob.)”