I think on the racism front, Yarrow is referring to the perception that the reason Moskovitz won't fund rationalist stuff is that either he thinks a lot of rationalists believe Black people have lower average IQs than white people for genetic reasons, or he thinks that other people believe that and he doesn't want the hassle. I think that belief genuinely is quite common among rationalists, no? Although there are clearly rationalists who don't believe it, and most rationalists are not right-wing extremists as far as I can tell.
The philosopher David Thorstad has extensively documented racism in the LessWrong community. See these two posts on his blog Reflective Altruism:
"Human biodiversity (Part 2: Manifest)" (June 27, 2024)
"Human Biodiversity (Part 7: LessWrong)" (April 18, 2025)
My impression is that Dustin Moskovitz filed for divorce from the LessWrong community due to its racism: Moskovitz announced the decision in the wake of the infamous Manifest conference in 2024, and when he discussed the decision on the EA Forum, he seemed to refer to or allude to the conference as the straw that broke the camel's back.
Sure, and do you want to stand behind any of those accusations? I am not going to argue the point against two blog posts. Which point do you think is the strongest?
As for Moskovitz, he can do as he wishes, but I think it was an error. I do think that ugly or difficult topics should be discussed, and I don't fear that. LessWrong, and Manifest, have cut okay lines through these topics in my view. But it's probably too early to judge.
Well, the evidence is there if you're ever curious. You asked for it, and I gave it.
David Thorstad, who writes the Reflective Altruism blog, is a professional academic philosopher and, until recently, was a researcher at the Global Priorities Institute at Oxford. He was an editor of the recent Essays on Longtermism anthology published by Oxford University Press, which includes an essay co-authored by Will MacAskill, as well as essays by a few other people well-known in the effective altruism community and the LessWrong community. He has a number of published academic papers on rationality, epistemology, cognition, existential risk, and AI. He's about as deeply familiar with the effective altruist community as it's possible for someone to be, and he has a deep familiarity with the LessWrong community as well.
In my opinion, David Thorstad has a deeper understanding of the EA community's ideas and community dynamics than many people in the community do, and, given the overlap between the EA community and the LessWrong community, his understanding extends to a significant degree to the LessWrong community as well. I think people in the EA community are accustomed to drive-by criticisms from people who have paid minimal attention to EA and its ideas, but David has spent years interfacing with the community and doing both academic research and blogging related to EA. So, what he writes are not drive-by criticisms and, indeed, a number of people in EA apparently listen to him, read his blog posts and academic papers, and take him seriously. All this to say, his work isn't something that can be dismissed out of hand. His work is the kind of scrutiny or critical appraisal that people in EA have been saying they want for years. Here it is, so folks had better at least give it a chance.
To me, "ugly or difficult topics should be discussed" is an inaccurate euphemism. I don't think the LessWrong community is particularly capable of or competent at discussing ugly or difficult topics. I think they shy away from the ugly and difficult parts, and generally don't have the stomach or emotional stamina to sit through the discomfort. What is instead happening in the LessWrong community is that people are credulously accepting ugly, wrong, evil, and stupid ideas, in some part due to an inability to handle the discomfort of scrutinizing them, and in large part because the community is just an ideological trainwreck that believes ridiculous stuff all the time (like the many examples I gave above) and typically has atrocious epistemic practices. For example, people just guess things or believe things based on a hunch without Googling them, and the community is extremely insular and fiercely polices the insider/outsider boundary; landing on the right side of that boundary is sometimes what even determines whether people keep their job, their friends, their current housing, or their current community.