Mostly agree, and I found your post insightful, but I'm not so sure about the 'confront this a bit' part. I feel that both most EAs and most Rationalists are solidly on the left (not on the radical, SJW fringe, but clearly left of center and Democratic-leaning). I vaguely remember reading somewhere that Tyler Cowen described EA as 'what SJW should be like'. Still, I feel that political partisanship and accepting labels is such a generally toxic and counterproductive affair that it is best avoided. And I think there's probably some inevitable tension inside EA between people who prioritize the search for truth and effectiveness, along with a high degree of respect for the freedom to explore unconventional and inconvenient truths, and others who lean more towards prioritizing left-coded practices and beliefs.
I'm actually perhaps less familiar with the distribution of political beliefs among EAs specifically; I'm thinking more about rationalist-adjacent communities at large, where there are definitely some people more comfortable around some pretty racist material than you'd find elsewhere (as someone else noted, ACX just published a review of Hanania's book "The Origins of Woke", which is apparently a big screed against civil rights law, and knowing Hanania, it's not hard to guess what he's driving at). So there is at least a tendency in which open-mindedness and a willingness to work everything out from first principles can let in some fairly questionable ideas.
I do agree about the problem with political labels, but I worry about whether that position will remain tenable if the "TESCREAL" label takes off in any meaningful way. Labels or not, if the rationalist community writ large comes under sustained political attack from one side of the aisle, natural alliances will form and polarization will almost certainly follow.
The results of the ACX survey just came out and let us examine political alignment and affiliation, both across the whole sample and broken down by LW / EA identification.
First, the overall sample.
On a left-right scale, nearly 70% of respondents fell on the left side of the spectrum.
Political affiliation: mostly liberal, social democratic and libertarian (in that order).
Now looking at LW ID to assess rationalist communities:
Quite similar, but LW ID’d people lean a bit more to the left than the general readership.
In terms of political affiliation, LWers are substantially less conservative, neo-reactionary, and alt-right, and much more liberal and libertarian.
Now looking at EA identification (though I would not expect EA-identified ACX respondents to reflect the EA community as a whole: they should be expected to be more ACX- and rationalist-leaning than average):
EAs lean further left still: 16.4% are on the right side of the spectrum, but most of that is close to the centre. 9.7% are in the category immediately right of centre, 13.5% fall within the two most centre-right categories, and only 2.9% are more right-leaning than that. (That's still more right-leaning than the general EA Survey sample, which I would expect to be less skewed, and which found 2.2% centre-right and 0.7% right.)
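Since the breakdown above is easy to misread, here is a quick arithmetic sanity check. The percentages are the ones quoted above; the nesting of categories (the 9.7% being contained in the 13.5%) is my reading of them:

```python
# EA-identified ACX respondents, right-of-centre breakdown.
# Percentages as quoted above; the category nesting is my interpretation.
two_most_centre_right = 13.5  # includes the 9.7% immediately right of centre
further_right = 2.9           # everything more right-leaning than those two

# The two groups should recover the quoted 16.4% right-of-centre total.
total_right = round(two_most_centre_right + further_right, 1)
print(total_right)  # 16.4
```

So the figures are internally consistent: the seemingly large right-of-centre share is almost entirely made up of near-centre respondents.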
In terms of political affiliation, EAs are overwhelmingly liberal (almost 50% of the sample), followed by social democratic (another 30.5%), with 15.3% libertarian. There are 4% conservatives and under 1% each for alt-right and neo-reactionary (3 and 5 respondents respectively), so definitely into lizardman territory.
Thanks, that's useful! I guess the surprising thing is just that there are still some fairly prominent names in the rationalist space who express obviously very right-wing views and are generally not seen as such. For example, Scott Alexander just reviewed Hanania's new book, and I'd say he almost comes across as naive in how thoroughly he avoids acknowledging "well, clearly Hanania is barely stopping short of saying black people are just stupider", something Hanania has said openly elsewhere anyway, so it's hardly a mystery that he believes it.
Could you provide links to those statements by Hanania?
Not a gotcha; I've just barely heard of this guy, and from what you say I expect all discourse around him to be a cesspool.
I would need to dig up specific material, but in general I'd suggest just checking his Twitter/X account https://twitter.com/RichardHanania and seeing what he says. These days it's completely dominated by discourse on the Palestine protests, so it's hard to surface anything on race. Mind you, he doesn't hold a fully stereotypical GOP-aligned package of ideas: he has a few deviations and is secular (for example, he's pro-choice on abortion; he's also definitely not antisemitic, and in fact has explicitly called himself prosemitic, as he believes Jews to be smarter). But on race, going by every time he's talked about it, I'm fairly convinced he fully believes in scientific racism. I won't link any of the opinion pieces arguing for this (there are a fair number if you want to check them out and try to separate fact from fiction; many point out that he has lately retreated to more defensible "motte" arguments, which he does seem to do, and which he explicitly advocates as a strategy in his latest book "The Origins of Woke" too; again, see the ACX review). But for some primary evidence, here's a tweet about how crime can only be resolved by more incarceration and surveillance of black people:
https://twitter.com/RichardHanania/status/1657541010745081857?lang=en-GB
His RationalWiki article obviously has opinions about him, but it also includes a number of links to primary sources in its bibliography:
https://rationalwiki.org/wiki/Richard_Hanania
He used to write more explicitly racist material under the pseudonym Richard Hoste until a few years ago. He has openly admitted this and wrote an apology blog post in which he basically says he was young and went a bit too far. Whether this corresponds to genuine moderation (from extremely right-wing to merely strongly socially conservative and anti-woke) is questionable, because it could just as well be a calculated retreat from a bailey to a motte. That possibility isn't far-fetched given that, again, he explicitly talks about how certain arguments would scare normies too much, so it's better to present more palatable ones. And after all, that is a pretty sound strategy (and one Torres recently accused EAs of, re: using malaria bednets as the motte to draw people into the bailey of AI safety; I don't see that as nearly as sinister as he implies, since I think AI safety absolutely is a real concern, and the fact that it looks weird to the average person doesn't make it otherwise).
At this point, from all I've seen, my belief is that Hanania is mostly a "race realist" who thinks some races are inherently inferior, so that the correct order of things has them working worse jobs, earning less money, and so on, and all efforts in the opposite direction are unjust and counterproductive. I don't think he goes from there to "and they should be genocided", but that's not much of a consolation. He still thinks they should be an underclass, and that the market left to its own devices would make them one, which to him would be the rightful order of things. That's the model of him I've built, and I find it hard to believe that Scott Alexander, for example, hasn't seen all the same material.