I feel like the controversy over the conference has become a catalyst for tensions in the involved communities at large (EA and rationality).
It has been surprisingly common for me to make what I perceive to be a totally sensible point that isn’t even particularly demanding (about, e.g., maybe not tolerating actual racism), only for the “pro truth-seeking faction” to lump me together with social justice warriors and present analogies that make no sense whatsoever. It’s obviously not the case that if you want to take a principled stance against racism, you’re logically compelled to have also objected to things that were important to EA (like work by Singer, Bostrom/Savulescu human enhancement stuff, AI risk, animal risk [I really didn’t understand why the latter two were mentioned], etc.). One of these things is not like the others. Racism runs counter to universal compassion and equal consideration of interests (also, it typically involves hateful sentiments). None of the other topics are like that.
To summarize, it seems concerning if the truth-seeking faction is unable to understand the difference between, say, my comments and how a social justice warrior would react to this controversy. (This isn’t to say that none of the people who criticized aspects of Manifest were motivated by further-reaching social justice concerns; I readily admit that I’ve seen many comments that in my view go too far in the direction of cancelling/censorship/outrage.)
Ironically, I think this is very much an epistemic problem. I feel like a few people have acted a bit dumb in the discussions I’ve had here recently, at least if we consider it “dumb” when someone repeatedly fails at passing Ideological Turing Tests or shows a bit of black-and-white thinking about a topic. I get the impression that the rationality community has suffered quite a lot defending itself against cancel culture, to the point that they’re now a bit (lowercase-t) traumatized. This is understandable, but that doesn’t change that it’s a suboptimal state of affairs.
Off-putting to whom?
If it bothers me, I can assume that some others will react similarly.
You don’t have to be a member of the specific group in question to find it uncomfortable when people in your environment say things that rile up negative sentiments against that group. For instance, twelve-year-old children are unlikely to attend EA or rationality events, but if someone there talked about how they think twelve-year-olds aren’t really people and their suffering matters less, I’d be pissed off too.
All of that said, I’m overall grateful for LW’s existence; I think habryka did an amazing job reviving the site, and I do think LW has overall better epistemic norms than the EA forum (even though, if I had to pick only one label, most of the people I intellectually admire the most are more EAs than rationalists, they’re often people who seem to fit into both communities).
Well, we agree that it doesn’t feel great to feel misunderstood.
It has been surprisingly common for me to make what I perceive to be a totally sensible point that isn’t even particularly demanding (about, e.g., maybe not tolerating actual racism), only for the “pro truth-seeking faction” to lump me together with social justice warriors and present analogies that make no sense whatsoever.
Okay, what does not tolerating actual racism look like to you? What is the specific thing you’re asking for here?
Up until recently, whenever someone criticized rationality or EA for being racist or for supporting racists, I could say something like the following:
“I don’t actually know of anyone in these communities who is racist or supports racism. From what I hear, some people in the rationality community occasionally discuss group differences in intelligence, because this was discussed in writings by Scott Alexander, which a lot of people have read and so it gives them shared context. But I think this doesn’t come from a bad place. I’m pretty sure people who are central to these communities (EA and rationality) would pretty much without exception speak up strongly against actual racists.”
It would be nice if I could still say something like that, but it no longer seems like I can, because a surprising number of people have said things like “person x is quite racist, but [...] interesting ideas.”