I don't know exactly how it would work, but we need to get better as a community at excluding extreme-right authoritarian people from spaces associated with us. It's bad stuff on its merits, it's uncomfortable for many EAs who are not straight white men (not all of them, necessarily; obviously more than literally zero very right-wing people of colour are fine with this stuff, and some are in EA), and it makes me, and I suspect other people, nervous about publicly identifying with EA. (I don't think we should play down what we believe to be popular, but I do think we should reject/eject people for believing stuff that is at once wrong, bigoted, and reputationally toxic.)
I would slightly separate the "pro-eugenics" stuff from the white nationalism, because "eugenics", alas, covers a wide variety of importantly different things. Certain ideas associated with "liberal eugenics" are quite mainstream in parts of analytic philosophy, whereas writing about the United States becoming too genetically Mexican is thankfully mainstream nowhere reputable. See for example: https://plato.stanford.edu/entries/eugenics/#ArguForLibeEuge One of the "liberal eugenicists" discussed there, Julian Savulescu, held a titled chair in practical ethics at Oxford. (In fairness, finding Savulescu totally beyond the pale, and being outraged by this, is likely also a mainstream opinion in analytic philosophy.) For this reason, I think it is much harder to have a policy of "throw out eugenicists" than one of "throw out racists", whatever you think of the substantive merits of the "liberal eugenics" position.
As it happens, while I am very queasy about the things the liberal eugenicists say, I think there are probably (realistic, not necessarily actual) circumstances in which I would support genetic enhancement, including for "non-medical" purposes: i.e. circumstances where I judged that the risks of abuse, coercion, and increased racism historically associated with this sort of thing were low enough to be outweighed by the benefits. And by the standard official definitions, any genetic enhancement (actually, even in clearly medical contexts) is eugenics.
In my view, the fact that a) some "liberal" eugenic opinions have relatively decent mainstream clout in academic ethics, and b) substantively, it is genuinely not obvious that *every* view that can be classed as "eugenics" under any reasonable definition, taken literally, is a bad view, makes a blanket "exclude eugenicists" policy hard to get right. (Though if someone is happy to call themselves a "eugenicist", I think that is usually a bad sign about their politics/worldview, even if they apparently only profess mild eugenic opinions.)
On the other hand, you absolutely can just boot everyone who, like Razib Khan, has written for a white nationalist website (and doesn't seem sufficiently repentant: Hanania's repentance is insufficient when combined with his tendency to still do things like calling black lawyers he doesn't like ("these people") "animals" on Twitter). It's easy: just ban them from your events and don't hire them at your orgs. Will there be borderline cases of "white nationalist website"? Sure. But that is true for any realistic rule of the form "exclude people who think X from your movement", and almost everyone will agree that some such rules are OK (e.g. excluding open Hitler fans). And it's also true for most rules about other things too. Almost any rule about social/political stuff requires some skill at judgment to apply and admits borderline cases. (Though writing for borderline white nationalist websites seems bad as well.)
None of this is to say I think that everything in EA around eugenics is fine, by the way. I think support for even liberal eugenics is often (not always) a tell that someone has dodgy political views across the board. I suspect (though cannot prove) that people sometimes present as believing only in non-coercive stuff in this area when they actually support coercion. Other times, they are good at projecting a general feel of "this is sensible, careful, mainstream stuff, not far-right", whilst not explicitly ruling out support for (even) extensive coercion. E.g. this forum post, which has a lot of "pro-freedom" vibes, but read carefully seems to explicitly disavow only forced sterilization and murder as coercive interventions, whilst hinting at favoring jailing people for having children with a high chance of disease/disability: https://forum.effectivealtruism.org/posts/PTCw5CJT7cE6Kx9ZR/most-people-endorse-some-form-of-eugenics (I also seem to recall the author hinting at more extreme views on Twitter later, though I haven't checked.)
"Eugenics" is the worst word. (Is there any other word in the English language where the connotations diverge so wildly from the literal denotation?) "Liberal eugenics" is effectively a scissor-statement to generate utterly unnecessary conflict between low and high decouplers. Imagine if the literal definition of "rape" didn't actually include anything about coercion or lack of consent, and then a bunch of sex-positive philosophers described themselves as being in favor of "consensual rape" instead of picking a less inflammatory way of describing being sex-positive. That's eugenics discourse today.
ETA: my point being that it would seem most helpful (both for clear thinking and for avoiding unnecessary conflict) for people to use more precise language when discussing technologically aided reproductive freedom and technologically aided reproductive coercion. These two opposites are not the same thing merely because both involve technology and goal-directedness in relation to reproduction!
we need to get better as a community at excluding extreme-right authoritarian people from spaces associated with us. It's bad stuff on its merits, it's uncomfortable for many EAs who are not straight white men (not all of them, necessarily; obviously more than literally zero very right-wing people of colour are fine with this stuff, and some are in EA), and it makes me, and I suspect other people, nervous about publicly identifying with EA.
So you want EA orgs to use their clout to push for EA-associated spaces not to allow people that you and some number of EAs don't like?
I don't want this. CEA can choose who it wants to invite to EAGs (and I think it manages to block out extreme-right authoritarians pretty well). Other orgs can invite who they want.
I find this desire for control over other people and spaces bad. I predict it has a chilling effect on ideas. A big chunk of thinking about AI (for better or worse) came from people whom I, and I'd guess you, at times find uncomfortable. Would you have endorsed not engaging with those people 10 years ago?
Also, I just really don't think there were many authoritarian right-wingers at these events. It feels like the poster and I went to completely different events?
I find this desire for control over other people and spaces bad.
I think the key words in the text you quoted are "spaces associated with us":
If it's an EA space, then it isn't really an "other" space.
If it's a non-EA space that is somehow being coded by others as an EA space, then it's reasonable for EA to distance itself from that space and to expect the other organization to make its non-EA nature quite clear.
Imagine there was someone with the same name as me writing vile nonsense on the internet, and others were misattributing it to me and making my life difficult. I would desire a measure of control over that situation, but it wouldn't be to silence speech I find distasteful. It would be to protect my own valid interest in not being associated with that speech.
Hmmmm, maybe. But what does distancing mean? Does it mean "saying we aren't rationalists"? That option has always been available to EAs who aren't. Does it mean "never booking events at Lighthaven"? That seems pretty silencing.