Claude thinks possible outgroups include the following, which is similar to what I had in mind
Based on the EA Forum’s general orientation, here are five individuals/groups whose characteristic opinions would likely face downvotes:
Effective accelerationists (e/acc) - Advocates for rapid AI development with minimal safety precautions, viewing existential risk concerns as overblown or counterproductive
TESCREAL critics (like Emile Torres, as you mentioned) - Scholars who frame longtermism/EA as ideologically dangerous, often linking it to eugenics, colonialism, or techno-utopianism
Anti-utilitarian philosophers - Strong deontologists or virtue ethicists who reject consequentialist frameworks as fundamentally misguided, particularly on issues like population ethics or AI risk trade-offs
Degrowth/anti-progress advocates - Those who argue economic/technological growth is net-negative and should be reduced, contrary to EA's generally pro-progress orientation
Left-accelerationists and systemic change advocates - Critics who view EA as a "neoliberal" distraction from necessary revolutionary change, or who see philanthropic approaches as fundamentally illegitimate compared to state redistribution
a) I'm not sure everyone in those categories would necessarily count as an outsider to EA (e.g. Will MacAskill only assigns a 50% probability to consequentialism being correct, and he and others in EA have long emphasized pluralism about normative ethical theories; there's been an EA system change group on Facebook since 2015, and discourse around systemic change has been happening in EA since before then)
b) Even if you do consider people in all those categories to be outsiders to EA or part of “the out-group”, us/them or in-group/out-group thinking seems like a bad idea, possibly leading to insularity, incuriosity, and overconfidence in wrong views
c) It's especially a bad idea not only to think in in-group/out-group terms and seek to shut down perspectives of "the out-group", but also to cast suspicion on the in-group/out-group status of anyone in an EA context who you happen to disagree with about something, even something minor. That seems like a morally, subculturally, and epistemically bankrupt approach
You're shooting the messenger. I'm not advocating for downvoting posts that smell of "the outgroup", just saying that this happens in most communities centered around an ideological or even methodological framework. It's a way you can be downvoted while still being correct, especially by the LEAST thoughtful 25% of EA Forum voters.
Please read the quote from Claude more carefully. MacAskill is not an "anti-utilitarian" who thinks consequentialism is "fundamentally misguided"; he's the moral uncertainty guy. In practice, the moral parliament usually recommends actions similar to consequentialism with side constraints.
I probably won’t engage more with this conversation.