I think censorship would be a bad choice here, because the EA forum hasn’t discussed these concepts before (not in any routine way, at least; I’m sure a screed or two could be dug up from a mound of downvotes) and is unlikely to in the future.
I would agree that race/IQ debates on the EA forum are unlikely to produce anything of value. But in my experience, having free discussion rights plus one banned topic causes more issues than just letting people say their piece and move on.
I’d also agree that EA isn’t meant to be a social club for autists—but from a cynical perspective, the blithely curious and alien-brained are also a strategic resource and snubbing them should be avoided when possible.
If people are still sharing takes on race/IQ two weeks from now, I think that would be a significant enough detraction from the goal of the forum to support the admins telling them to take it elsewhere. But I would be surprised if it were an issue.
Vulnerable EAs also want to follow only good norms while disposing of the bad ones!
If you offer people the heuristic “figure out whether the norm is reasonable and only obey it if it is,” they will often fail.
You mention clear-cut examples, but oftentimes the situations will be very grey, or at least they will seem grey from the inside. There may be several strong arguments for why the norm isn’t a good one; the bad actor will be earnest, apologetic, and trying to let you keep your norm even though they don’t believe in it. They may seem like a nice, reasonable person trying to do the right thing in an awkward situation.
Following every norm would be quite bad. Socially enforced gendered cosmetics are disgusting and polyamory is pretty nifty.
Nonetheless, we must recognize that the same process that produces “polyamory is pretty nifty” will also produce in many people: “there’s no reason I can’t have a friendly relationship with my employer rather than an adversarial one” (these are the words they will use to describe the situation while living in their employer’s house) and “I can date my boss if we are both ethical about it.”
We must not look down on these people as though we’d never fall for it—everyone has things they’d fall for, no matter how smart they are.
My suggestion is to outsource. Google your situation. Read Reddit threads. Talk to friends, and DM people who have the same job as you (and who you are certain have zero connection to your boss); chances are they’ll be happy to talk to someone in the same position.
A few asides, noting that these are basics and incomplete.
If someone uses the phrase “saving the world” with anything approaching consistency, run. Legitimate people working on legitimate problems do not rely on this drama. The more exciting the narrative and the more prominent a role the leader plays in it, the more skeptical you should be.
(Ah, you might say, but facts can’t be too good to be true: they are simply true or false. My answer to that would be the optimizer’s curse.)
If someone compares themselves to Professor Quirrell, run. In a few years, we’ll have enough abusers who identified with him to fill a scrapbook.
If there’s a schmuck in EA dumb enough to compare themselves to Galileo or da Vinci, exit calmly while giggling.
If someone is willing to break a social contract for utilitarian benefit, assume they’ll break other social contracts for personal benefit, e.g. sex.
If you are a somewhat attractive woman with unusual epistemic rigor, assume people will try to take advantage of that.
If someone wants unusual investment from you in a relationship, outsource.
If they say they’re uncomfortable with how much you talk to other people, this must be treated as an attempt to subvert you.
Expect to hear “I have a principled objection to lying and am utterly scandalized whenever someone does it” many times, and be prepared to catch that person lying.
If someone pitches you on something that makes you uncomfortable, but for which you can’t figure out your exact objection—or if their argument seems wrong but you don’t see the precise hole in their logic—it is not abandoning your rationality to listen to your instinct.
If someone says “the reputational risks to EA of you publishing this outweigh the benefits of exposing x’s bad behavior; if there’s even a 1% chance that AI risk is real, then this could be a tremendously evil thing to do,” nod sagely, then publish that they said that.
Those last two points need a full essay to be conveyed well but I strongly believe them and think they’re important.