How would you feel about reposting this in EAs for Political Tolerance (https://www.facebook.com/groups/159388659401670) ? I’d also be happy to repost it for you if you’d prefer.
I’m honestly a bit flummoxed here. Why would contributing to a Facebook group explicitly aligned with one side of this dispute help avoid a split?
The group is still new, so it’s still unclear exactly how it’ll turn out. But I don’t think that’s a completely accurate way of characterising the group. I expect that there are two main strands of thought within the group—some see themselves as fighting against woke tendencies, whilst others are more focused on peace-making and want to avoid taking a side.
“On the other hand, we’ve had quite a bit of anti-cancel-culture stuff on the Forum lately. There’s been much more of that than of pro-SJ/pro-DEI content, and it’s generally got much higher karma. I think the message that the subset of EA that is highly active on the Forum generally disapproves of cancel culture has been made pretty clearly”
Perhaps. However, this post makes specific claims about ACE. And even though these claims have been discussed somewhat informally on Facebook, this post provides a far more solid writeup. So it does seem to be making a significantly new contribution to the discussion and not just rewarming leftovers.
It would have been better if Hypatia had emailed the organisation ahead of time. However, I believe ACE staff members might have already commented on some of these issues (correct me if I’m wrong). And it’s more of a good practice than a strict requirement—I totally understand the urge to just get something out there.
“I’m sceptical that further content in this vein will have the desired effect on EA and EA-adjacent groups and individuals who are less active on the Forum, other than to alienate them and promote a split in the movement, while also exposing EA to substantial PR risk”
On the contrary, now that this has been written up on the Forum it gives people something to link to, so Forum posts aren’t just read by people who regularly read the Forum. In any case, this kind of high-quality write-up is unlikely to have a significant effect on alienating people compared to some of the lower-quality discussions on these topics that occur in person or on Facebook. So, from my perspective it doesn’t really make any sense to be focusing on this post. If you want to avoid a split in the movement, I’d like to encourage you to join the Effective Altruists for Political Tolerance Facebook group and contribute there.
I would also suggest worrying less about PR risks. People who want to attack EA can already go around shouting about ‘techno-capitalists’, ‘overwhelmingly white straight males’, ‘AI alarmists’, etc. If someone wants to find something negative, they’ll find something negative.
Part of my model is that there is decreasing marginal utility as you invest more effort in one form of outreach, so there can be significant benefit in investing small amounts of resources in alternate forms of outreach.
I hope you find funding to pay someone to organise this, as I suspect this program could be extremely impactful.
I would also love to see some amount of prize money funded for this. I wouldn’t be surprised if a relatively small amount of money by philanthropic standards could tempt more of the top debaters to enter.
I actually found the Facebook group very difficult to search for—link is here.
Making a Wiki successful is always about seeding content. There’s a lot of past content that could be copied over and updated, but it’s not pleasant work, so it’s good that Pablo has a grant.
As an addendum: First, suppose you compare a group of random people from the same demographic to a random group of people from different demographics. Next, suppose you compare a group of random lawyers to a group of random lawyers of different demographics. I would suggest that in the second case the increase in diversity from adding demographic diversity would be significantly reduced, as the bar to becoming a lawyer would filter out a lot of the diversity of experiences present in the first case. For example, a greater proportion of African Americans experience poverty than the general population, but the difference among those who become lawyers would be much less.
“They were founded under the premise that conservative viewpoints are underrepresented in scientific discourse”—that’s definitely a possibility, although I suspect that for research into underrepresented groups in general, almost all research will have been conducted by people with strong pre-existing beliefs about whether or not such a group is underrepresented.
I think there’s value in considering people’s possible psychological motivations, but I find it more helpful to consider these for all parties. In such a conversation, the rich could very well be afraid of losing their privilege and the poor could very well be jealous or resentful.
It was a general comment about how this lens is often applied in practice, even though this isn’t the only possible way for it to be applied.
“As we cannot measure the diversity of perspectives of a person directly, our best proxy for it is demographic diversity”
Demographic diversity is a useful proxy and may add something additional even if we did have diversity of general philosophy. However, we can measure diversity of perspectives directly, i.e. by running surveys like Heterodox Academy has.
“The answer here is that objectivity is not something that a single person has, but that objectivity is a social achievement of a diverse community”
Feminism offers a valuable lens, but I feel it often leads to a hyperfocus on the underprivileged. Suppose we’re discussing raising taxes on the rich; it might be useful to have a rich guy in the room. They might share some useful perspectives like, “It won’t change the behaviour of my friends one bit. Most of us won’t even notice. Our accountants handle our taxes, so we have no idea how much we’re paying” or “If the California tax law passes, I’m headed to Texas”. They might lie, but that’s true of everyone. They might be biased, but the poor are likely to be biased as well.
I’m not claiming this is as important as representing the perspectives of the poor, just that we shouldn’t be hyperfocused.
I also think Charity Science might have tried getting people to pledge in their wills.
Yeah, hopefully at some point I find time to make another post, linking to various aspects of what I’d define as the community. I guess who is in or not is not well-defined, as it’s not really a single community. Rather, it’s a bunch of groups with similar kinds of people who seem to be talking to each other about similar kinds of things, most of whom I think would agree that they’re doing something like sensemaking.
Regarding your second question, if you head over to the Stoa or listen to Both/And, you’ll see people from across the spectrum, although not really many strong social justice proponents. I suppose my suspicion is mainly driven by the intuition that ending the culture wars requires a movement with positive content of its own and not merely a negative critique as Quillette and (to a lesser degree) Persuasion seem to do. People need a reason to join apart from simply being sick of the culture wars.
Yeah, I agree that there would be significant benefits to trying to set up another academic research institute at a university more focused on economics.
This is full, but it’s worth getting people to subscribe for the future.
Hmm… often I think it is nice to have a standard term for a phenomenon so that people don’t have to figure out how to express a certain concept each time and then hope that everyone else can follow. Language also has the advantage that insofar as we convince people to adopt our language, we draw them into our worldview.
This should really be a Wiki page instead since these lists (I even made one myself in the past) always become outdated.
This is a really challenging situation—I could honestly see myself leaning either way on this kind of scenario. I used to lean a lot more towards saying whatever I thought was true and ignoring the consequences, but lately I’ve been thinking that it’s important to pick your battles.
I think the key sentence is this one—“On many subjects EAs rightfully attempt to adopt a nuanced opinion, carefully and neutrally comparing the pros and cons, and only in the conclusion adopting a tentative, highly hedged, extremely provisional stance. Alas, this is not such a subject.”
What seems more important to me is not necessarily these kinds of edge cases, but that we talk openly about the threat potentially posed. Replacing the talk with a discussion about cancel culture instead seems like it could have been a brilliant Jiu Jitsu move. I’m actually much more worried about what’s been going on with ACE than anything else.