Upvoted for starting an interesting and probing conversation. I do have several nitpicks.
Perhaps the most common criticism of EA is that the movement does not collectively align with radical anticapitalist politics
Maybe I’ve just stopped paying attention to basic criticisms of EA along these lines, because every time EA’s best responses to them are offered in an attempt at good-faith debate, the critics apparently aren’t interested in a serious dialogue that could actually change EA. Still, in the last couple of years, while the absolute amount of anticapitalism has increased, I’ve noticed less criticism of EA on the grounds that it’s not anticapitalist enough. I think EA has developed a cemented reputation as a community that is primarily left-leaning and certainly welcomes anticapitalist thought, but that won’t on the whole mobilize toward anticapitalist activism, at least until anticapitalist movements themselves produce effective means of ‘systemic change.’
An autistic rights activist condemned EA by alleging incompatibility between cost-benefit analysis and disability rights
I’m skeptical that friction between EA and actors who misunderstand so much has consequences bad enough to worry about, since I don’t expect anyone else would take the criticism seriously enough for it to have much of an impact.
Key EA philosopher Peter Singer has been viewed negatively by left-wing academia after taking several steps to promote freedom of speech (Journal of Controversial Ideas, op-ed in defense of Damore)
Key EA philosopher Peter Singer was treated with hostility by left-wing people for his argument on sex with severely cognitively disabled adults
Peter Singer has been treated with hostility by traditional conservatives for his arguments on after-birth abortion and zoophilia
I’m also concerned about the impact of Singer’s actions on EA itself, but I’d like to see more focused analysis exploring what the probable impacts of controversies around Singer are.
MacAskill’s interview with Joe Rogan provoked hostility from viewers because of an offhand comment/joke he made about Britain deserving punishment for Brexit
William MacAskill received pushback from right-wing people for his argument in favor of taking refugees
Ditto my concerns about controversies surrounding Singer for Will as well, although I am generally much less concerned with Will than Singer.
Useful x-risk researchers, organizations and ideas are frequently viewed negatively by leftists inside and outside academia
I know some x-risk reducers who think a lot of left-wing op-eds are beginning to create a sentiment in some relevant circles that a focus on ‘AI alignment as an existential risk’ is a pie-in-the-sky, rich techie white guy concern about AI safety, and that more attention should be paid to how advances in AI will affect issues of social justice. The concern is that diverting AI safety efforts away from the existential risk posed by AGI toward what are perceived as more parochial concerns could be grossly net negative.
Impacts on existential risk:
None yet, that I can think of
Depending on what one considers an x-risk, popular support for right-wing politicians who pursue counterproductive climate or other anti-environmental policies, or who tend to be more hawkish, jingoistic, and nationalistic in ways that increase the chances of great-power conflict, negatively impacts x-risk reduction efforts. It’s not clear that this has a direct impact on any EA work focused on x-risks, though, which is the kind of impact you meant to assess.
Left-wing political culture seems to be a deeper, more pressing source of harm.
I understand you provided a caveat, but I think this take still misses a lot.
If you asked a lot of EAs, I think most would say right-wing political culture is a deeper potential source of harm to EA than left-wing political culture. Left-wing political culture is only a more pressing source of harm because EA is disproportionately left-leaning, so the social networks EAs run in, and thus decision-making in EA, are more likely to be currently impacted by left-wing political culture.
It misses what counts as ‘left-wing political culture,’ especially in Anglo-American discourse, because the left-wing landscape is rapidly and dramatically shifting. While most EAs are left-leaning, and a significant minority would identify with the socialist/radical/anticapitalist/far-left basket, a greater number, perhaps a plurality, would identify as centre-left/liberal/neoliberal. From the political right, and from other angles, both these camps are ‘left-wing.’ Yet they’re sufficiently different that when accuracy matters, as it does regarding EA, we should use more precise language to differentiate between centre-left/liberal and radical/anticapitalist/far-left ‘left-wing political culture.’ For example, in the U.S., it currently seems the ‘progressive’ political identity can apply to everyone from a neoliberal to a social democrat to a radical anticapitalist. On leftist forums I frequent, liberals are often labelled ‘centrists’ or ‘right-wing,’ and are perceived as having more in common with conservatives and moderates than with anticapitalists.
Anecdotally, I would say the grassroots membership of the EA movement is more politically divergent, less moderate, and generally “to the left” of flagship EA organizations and institutions: I talk to a lot of EAs who feel EA is still too far to the right for their liking, and who actually agree with, and wish EA would adopt, many of the changes left-wing critics demand of us.
I’m skeptical that friction between EA and actors who misunderstand so much has consequences bad enough to worry about, since I don’t expect anyone else would take the criticism seriously enough for it to have much of an impact.
Assuming one cares about their definition of “disability rights” (i.e., disabled people have a right to extensive healthcare and social services, and any de-emphasis for the sake of helping more able people is a violation), their criticism and understanding of EA are correct. In the public eye it’s definitely catchy; this sort of suspicion of utilitarian cost-benefit analysis runs deep. Some weeks ago the opinion journalist Dylan Matthews mentioned that he wanted to write an article about it, and I expect he would give a very kind platform to the detractors.
Depending on what one considers an x-risk, popular support for right-wing politicians who pursue counterproductive climate or other anti-environmental policies, or who tend to be more hawkish, jingoistic, and nationalistic in ways that increase the chances of great-power conflict, negatively impacts x-risk reduction efforts. It’s not clear that this has a direct impact on any EA work focused on x-risks, though, which is the kind of impact you meant to assess.
Right, for that broad sort of thing, I would direct people to my Candidate Scoring System: https://1drv.ms/b/s!At2KcPiXB5rkvRQycEqvwFPVYKHa