Slate Star Codex, EA, and self-reflection

Here is an argument:

1. Eugenics and white supremacy are bad ideas, and endorsing or even engaging with these views could lead to very bad outcomes.

2. Associating with and endorsing ideas that will lead to very bad outcomes is not a good thing to do for a community dedicated to making the world better.

3. Scott Alexander, in his blog Slate Star Codex, has openly supported eugenics and white supremacy.

C. EA should do everything in its power to distance itself from these communities, ideas, and individuals, and should seriously reflect on its relationship with them.


TL;DR: I find it very strange that people who want to make the world better continue to engage with white supremacists and associated ideas. I’ve worked in EA for a long time and have long known that racism exists in the community, but the events of the last week surrounding the removal of Slate Star Codex by its author set a new low. For me, this is a good opportunity to reflect on my own complicity in these ideas, and it seems like it ought to be an inflection point where this community decides that it actually wants to try to make the world better and stand against ideas that probably make it worse.

EA should wholeheartedly be against white supremacy. We aren’t going to make the world any better if we aren’t. That should be obvious, and the fact that this idea isn’t obvious to many people in EA is frightening. Right now, I am frightened that engaging with EA has made me complicit in racism, because the community seems very comfortable hosting people who are openly supportive of hate and prejudice. The response to the Slate Star Codex situation has been very disappointing and disheartening. A good rule of thumb might be that when InfoWars takes your side, you probably ought to do some self-reflection on whether the path your community is on is the path to a better world.

***

Earlier this week, Slate Star Codex, a blog popular in the rationalist and EA communities, took itself offline after the author, who writes under the pen name Scott Alexander, claimed that the New York Times was going to “dox” him “for clicks.” In his view, the NY Times, which he believed was going to publish an article including his real name, was doxxing him by doing so, as he is a psychiatrist whose patients might learn about his personal life and views. He requested that his followers contact the editor of the article (a woman of color).

In response, the Slate Star Codex community basically proceeded to harass and threaten to dox both the editor and the journalist writing the article. Multiple individuals threatened to release their addresses or explicitly threatened them with violence. Several prominent EAs, such as Peter Singer, Liv Boeree, Benjamin Todd, Anna Soloman, Baxter Bullock, Arden Koehler, and Anders Sandberg, have signed a petition calling on the NY Times not to “dox” Scott Alexander, or have spoken out in support of him.

I’ve worked at EA organizations for several years, ran an EA organization, and over time have become deeply concerned about how complicit in racism, sexism, and violence the community is. When I first heard of the concept, I was excited by a large community of people dedicated to improving the world and having the most impact they could on pressing problems. But the experience of watching the EA community endorse someone who openly provided platforms for white supremacists and publicly endorsed eugenics is incredibly disturbing.

A strange and disappointing tension has existed in EA for a long time. The community is predominantly white and male. Anonymous submitters on the EA Forum have supported ideas like racial IQ differences, which are not only scientifically debunked but deeply racist. (I know someone is considering arguing with me about whether or not they are debunked. I have nothing to say to you; other people have demonstrated this point more clearly elsewhere.)

Slate Star Codex appears to have been an even greater hotbed of these ideas than the typical level for EA. Scott Alexander has openly endorsed eugenics and Charles Murray, a prominent proponent of racial IQ differences (Alexander identifies with the “hereditarian left”). He was supported by alt-right provocateurs such as Emil Kirkegaard and Steve Sailer. On the associated subreddits, talk about eugenics, white supremacy, and related topics was a regular feature. These weren’t just “open discussions of ideas,” as they were often framed. These forums included explicit endorsements of literal white supremacist slogans like the 14 words (some of which have been conveniently deleted in the last few days).

The United States is in the middle of unprecedented protests against police brutality towards people of color. The effective altruist community, which claims to be dedicated to making the world better, has mostly ignored this. Few EA organizations have taken even the minimal step of speaking out in support. And when the NY Times decided to write an article including the real name of Scott Alexander, who, again, seems to both endorse and protect white supremacists and is certainly a eugenicist, the community attacked a woman of color on his word.

The fact that people are not disturbed by this turn of events is downright frightening. Notable EAs such as Rob Wiblin of 80,000 Hours and Kelsey Piper of Vox are speaking out in support of Alexander or openly celebrating him. A value in the EA community has always been open engagement with controversial ideas. That is probably a good thing in many cases. That doesn’t mean EA should be giving platforms to people whose vision for the world is downright horrible. The EA community has demonstrated through this event that our current collective vision for the world is not a good one. It’s an oppressive, unkind, and possibly violent one.

To be fully fair, Slate Star Codex is probably more associated with the rationalist community than with EA. This should probably make EA very wary of rationalism and of associating with it. And as a whole, this seems like a time for a lot of self-reflection on the part of the EA community. How do we ensure that our community is more accessible to more people? How do we distance ourselves from white supremacy? Is EA really building something good, or reinforcing some very bad harms? This seems like a very good opportunity for the EA community to genuinely reflect and grow. I personally will be taking time to do that reflection, both on my own actions and on whether I can continue to support a community that fails to do so.