We’ve recently gone through a series of intense EA controversies. I get the sense that many EAs have found the EA discussion around these to be exhausting and disappointing.
Whatever the details of the current controversies, I think it’s clear there’s room for improvement. This could be a good time to reflect on what would be useful.
I don’t want to comment on the heated topics of the day, so let’s assume that these changes will only be applied to future topics, no matter what those might be.
“If you’re angry, leave it a week”
Difficult discussions do not need to happen right now. They can be left and then happen slowly. That isn’t dishonest.
I agree, and to that end I want to block the forum for a week. A problem I see with coming back in a few months and giving a cold take on something is that ~everyone will be sick of the topic due to all the hot takes that happened, and won’t want to talk in a way that will change anything. Emotionality seems needed to motivate change.
“Emotionality seems needed to motivate change” is a great point, DC. Some of the more interesting things I’ve written have been while angry. In theory, if I could have maintained the passion and thought process, it would have been better to leave it a week; but in reality, if I hadn’t written at the time, I wouldn’t have written at all.
Emotions and anger to some degree drive action. Unfortunately we are human.
I brought up something similar in this thread, and appreciated people making the point that anger as a reaction is still informative and might feel appropriate to communicate in the moment.
I think you can have both by communicating your feelings and how you interpreted something, while being careful not to jump to judgments about what should happen and who behaved wrongly. I’m basically thinking of the Nonviolent Communication framework, where you’re taught to carefully differentiate between observations, your feelings and needs, and requests, while avoiding making judgments about others, because that reliably undermines mutual understanding.
Posts that are specifically intended for debate and/or for reaching shared understanding of different viewpoints, using a template that participants must fill out in order to participate. The template forces participants to break down their views and clearly identify what they do and do not agree with from other sides (e.g., built around cruxes), and it forces concision.
Some comments have attempted to do this in the recent discussions, but have gotten lost amidst all the other discourse.
This seems worth an experiment.
More opinion survey data. This could mean that the EA Forum includes functionality for surveys, or that someone spends a lot of time making and distributing other surveys.
Or just the ability to run polls on shortforms.
Prizes for commenters who do “moderator” activities:
Clarifying people’s opinions.
Politely explaining conversational norms to difficult people.
Making conversation inviting and friendly.
We bring in a professional moderator (like a marriage therapist) to help oversee some of the discussion online.
I think when CHSP investigated something similar in the past, there wasn’t a lot of value found. I can get more details if desired.
I’m interested at least.
Video/audio discussions, where we get a few people representing different sides of an issue and have them discuss things. This would be recorded and shared. (Either publicly, or to a large list of EAs)
(Note: Spencer Greenberg did this a few times)
Yeah, I think it’s plausible that town halls or conversations between a small set of people should be more of a go-to move.
I think there have been some instances of unhelpful behavior for which an Official Moderator Comment might be too heavy-handed, even if not phrased as a warning. The problem with generally encouraging people to call out unhelpful behavior is that the disputants would probably just start weaponizing it.
It would be nice if there were a few well-respected, calm-headed people who could (if willing) be semi-formally recognized as non-moderator neutral promoters of Forum norms when a heated topic arose, and feel empowered to point out this kind of behavior. They would need to abstain from taking any substantive position, at least while things were heated, but could decline appointment if they did not want the role for a given controversy for whatever reason.
Yeah. I like the idea of more/better moderation. I would note that it’s a pretty thankless and exhausting job (many thanks to the current mods), so one big challenge is finding people strong enough, trusted enough, and willing to do it.
Be clear on things we all agree on.
I think the Bostrom stuff could have gone a lot better if there had been an initial period where we all agreed the original email was really bad and not the sort of thing any of us would send.
I suspect a lot of people were thinking, “well obviously it’s a terrible email, that goes without saying.”
Yeah I agree. But it turns out it did not in fact go without saying.
Indeed.
Some people seem to disagree even with that, while for others (or for me, at least) it indeed seemed that almost everyone agreed; but the old email was never the point.
Including an explicit checkbox to post/comment anonymously could be useful. This would empower users who would otherwise feel uncomfortable expressing themselves (whistleblowers, users who fear social reprisal, etc.).
However, it’s arguable that this proposal would reduce users’ sense of ownership of their words, and/or disincentivize users from associating their true identities with their stated beliefs.
EAs read literature on good conversational norms
Modelling being emotive and raw while owning it, e.g. this comment
Blog posts on the EA Forum outlining the incentives and reasons for such a heated environment
Did a pass at this: [Shortform here]
Features that contribute to heated discussion on the forum
Interpersonal and Emotional
Fear, on all sides
Political backlash
What other EAs will think of you
Just sometimes the experience of being on the forum
Trying to protect colleagues or friends
Speed as a reaction to having strong opinions, or worrying that others will jump on you
Frustration at having to rehash arguments / protect things that should go without saying
Desire to gain approval / goodwill from people you’d like to have hire/fund/etc. you in the future
Desire to sound smart
Desire to gain approval / goodwill from your friends, or people you respect
Pattern matching (correctly or not) to conversations you’ve had before and porting over the emotional baggage from them
Sometimes it helps to assume the people you’re talking to are still trying to win their last argument with someone else
Low trust environment
Surprise that something is even a question
I think there’s a nasty feedback loop in tense situations with low trust. (This section by Ozzie Gooen)
People don’t openly communicate their takes on things.
This leads to significant misunderstanding.
This leads to distrust of each other and assumptions of poor intent.
This leads to parties doing more zero-sum or adversarial actions to each other.
When any communication does happen, it’s inspected with a magnifying glass (because of how rare it is). It’s misunderstood (because of how little communication there has been).
The communicators then think, “What’s the point? My communication is misunderstood and treated with hostility.” So they communicate less.
Not tracking scrupulous truth-telling, out of a desire to get less criticism
Not feeling like part of the decision-making process; opacity of the reasoning of EA leadership
Not understanding how and why decisions that affect you are made
Feeling misunderstood by the public, sometimes feeling wilfully misunderstood
Something to protect / Politics
Trying to protect a norm you think matters
Trying to protect other people you think are being treated unfairly
Trying to create the EA you want by fiat / speech acts
Power / game theoretical desires to have power shift in EA towards preferred distribution
Speed: a sense that the conversation will get away from you otherwise
Organizational politics
An interest in understanding the internals of organizations you’re not part of
An interest in not-sharing the internals of organizations you are part of
More of a common move: inviting people you agree with, and people you don’t (probably in separate discussions), to small group conversations on Messenger/Signal/Facebook to air feelings and thoughts and hash things out.
Aim for clarity at both the object and meta levels
I think that a lot of people have been aiming for clarity about their exact views on what Bostrom said, while other people have been aiming for clarity about their views on racism.
I think it was fraught to try and have these two discussions at once, and it would have been better to first have the racism discussion and then the one about Bostrom’s exact views.
Different comments sections or something?
Prediction markets on “What some smart committee will say on this topic, ~5 years from now”, or “What the opinion on the EA Forum will be on this topic, ~5 years from now”.
An “invite-only” version of the EA Forum, for things people don’t want to say publicly. That way people could hash out some particularly scary disagreements in private.
(Also see this post)
That would likely leak unless kept quite small.
There was some discussion of this here: https://forum.effectivealtruism.org/posts/jRJyjdqqtpwydcieK/ea-could-use-better-internal-communications-infrastructure