Slate Star Codex, EA, and self-reflection
Here is an argument:
1. Eugenics and white supremacy are bad ideas, and endorsing or even engaging with these views could lead to very bad outcomes.
2. Associating with and endorsing ideas that lead to very bad outcomes is not something a community dedicated to making the world better should do.
3. Scott Alexander, in his blog Slate Star Codex, has openly supported eugenics and white supremacy.
C. EA should do everything in its power to distance itself from these communities, ideas, and individuals, and should seriously reflect on its relationship with them.
TL;DR: I find it very strange that people who want to make the world better continue to engage with white supremacists and associated ideas. I’ve worked in EA for a long time and have long known that racism exists in the community, but the events of the last week surrounding the removal of Slate Star Codex by its author set a new low. For me, this is a good opportunity to reflect on my own complicity in these ideas, and it seems like it ought to be an inflection point where this community decides that it actually wants to make the world better and stand against ideas that probably make it worse.
EA should be wholeheartedly against white supremacy. We aren’t going to make the world any better if we aren’t. That should be obvious, and the fact that it isn’t obvious to many people in EA is frightening. Right now, I am frightened that engaging with EA has made me complicit in racism, because the community seems very comfortable hosting people who are openly supportive of hate and prejudice. The response to the Slate Star Codex situation has been very disappointing and disheartening. A good rule of thumb might be that when InfoWars takes your side, you probably ought to reflect on whether the path your community is on is the path to a better world.
***
Earlier this week, Slate Star Codex, a blog popular in the rationalist and EA communities, was taken offline by its author, who writes under the pen name Scott Alexander, after he claimed that the New York Times was going to “dox” him “for clicks.” In his view, the NY Times, which he believed was going to publish an article including his real name, would be doxxing him by doing so, since he is a psychiatrist whose patients might learn about his personal life and views. He asked his followers to contact the editor of the article (a woman of color).
In response, the Slate Star Codex community proceeded to harass and threaten to dox both the editor and the journalist writing the article. Multiple individuals threatened to release their addresses or explicitly threatened them with violence. Several prominent EAs, such as Peter Singer, Liv Boeree, Benjamin Todd, Anna Soloman, Baxter Bullock, Arden Koehler, and Anders Sandberg, have signed a petition calling on the NY Times not to “dox” Scott Alexander, or have spoken out in support of him.
I’ve worked at EA organizations for several years, ran an EA organization, and over time have become deeply concerned about how complicit in racism, sexism, and violence the community is. When I first heard of the concept, I was excited by a large community of people dedicated to improving the world and having the most impact they could on pressing problems. But the experience of watching the EA community endorse someone who openly provided platforms for white supremacists and publicly endorsed eugenics is incredibly disturbing.
A strange and disappointing tension has existed in EA for a long time. The community is predominantly white and male. Anonymous submitters on the EA Forum have supported ideas like racial IQ differences, which are not only scientifically debunked but deeply racist. (I know someone is considering arguing with me about whether they are debunked. I have nothing to say to you; other people have demonstrated this point more clearly elsewhere.)
Slate Star Codex appears to have been an even greater hotbed of these ideas than is typical for EA. Scott Alexander has openly endorsed eugenics and Charles Murray, a prominent proponent of racial IQ differences (Alexander identifies with the “hereditarian left”). He was supported by alt-right provocateurs such as Emil Kirkegaard and Steve Sailer. On the associated subreddits, talk of eugenics, white supremacy, and related topics was a regular feature. These weren’t just “open discussions of ideas,” as they were often framed. These forums included explicit endorsements of literal white supremacist slogans like the 14 words (some of which have conveniently been deleted in the last few days).
The United States is in the middle of unprecedented protests against police brutality towards people of color. The effective altruist community, which claims to be dedicated to making the world better, has mostly ignored this. Few EA organizations have taken even the minimal step of speaking out in support. And when the NY Times decided to write an article including the real name of Scott Alexander, who, again, seems to both endorse and protect white supremacists and is certainly a eugenicist, the community attacked a woman of color on his word.
The fact that people are not disturbed by this turn of events is frightening. Notable EAs such as Rob Wiblin of 80,000 Hours and Kelsey Piper of Vox are speaking out in support of Alexander or openly celebrating him. Open engagement with controversial ideas has always been a value in the EA community, and that is probably a good thing in many cases. But it doesn’t mean EA should be giving platforms to people whose vision for the world is downright horrible. The EA community has demonstrated through this event that our current collective vision for the world is not a good one. It’s an oppressive, unkind, and possibly violent one.
To be fair, Slate Star Codex is probably more associated with the rationalist community than with EA. That should make EA very wary of rationalism and of associating with it. And as a whole, this seems like a time for a lot of self-reflection on the part of the EA community. How do we ensure that our community is accessible to more people? How do we distance ourselves from white supremacy? Is EA really building something good, or reinforcing some very bad harms? This seems like a very good opportunity for the EA community to genuinely reflect and grow. I personally will be taking time to do that reflection, both on my own actions and on whether I can continue to support a community that fails to do so.