The document above argues for a certain level of self-censorship in EA spaces. Some comments have raised the related issue of group censorship by a moderator. I’ve written up an example of someone who did not self-censor, and listed a few questions.
Concrete example: In the Christians in EA group, someone (who had AFAIK never posted before) posted a 60-page document. It outlined his theory that the best EA cause was to support the propagation of Mormonism, because any civilization based on the equality of men and women was doomed to fail (he saw Islam as another viable civilization, but inferior to Mormonism).
He wanted me to debate his argument point by point. I was not willing to, because it would have been a waste of my time. At some point the moderators of the group took action (I can’t recall whether they commented to say they didn’t support the post, or deleted it).
Questions:
- If someone posts with very significant errors, should the community spend time correcting those errors? Does it matter how much the person has contributed to the community so far?
- If the opportunity cost of correcting the errors is too high, what should we do instead?
- Would it have been more altruistic for the man in question to self-censor?
- What’s the cost of doing nothing when ‘politically incorrect’ posts come up?
Like many other problems EAs are aware of, the incident you describe comes from an outlier that pulls the mean far out. (I of course know who you are talking about, and the fact that many long-time EAs know as well suggests that such cases are rare as a percentage of people, yet perhaps not so rare as a percentage of the drama they account for.)
The key insight here is that the long tail matters. As a rough prior we might expect 80% of the drama to come from 20% of the people (in my experience it is even more skewed, perhaps 98% of the drama from 2% of the people). If so, advocating self-censorship for the community in general is stifling and unnecessary for the bulk of people (who already doubt themselves), and desperately necessary for the outliers who march forward without much self-awareness in one controversial direction or another, as if mandated by a higher power to cause as much drama as possible.
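To make the contrast concrete, here is a toy simulation (my own sketch, not from the discussion above): under a heavy-tailed Pareto model of "drama per person", the top 20% of people account for the vast majority of the total, whereas under a thin-tailed Gaussian-like model they account for only a little more than their 20% share.

```python
import random

random.seed(0)

def share_of_top(values, top_frac=0.2):
    """Fraction of the total contributed by the top `top_frac` of items."""
    values = sorted(values, reverse=True)
    k = max(1, int(len(values) * top_frac))
    return sum(values[:k]) / sum(values)

n = 10_000
# Heavy-tailed model: a Pareto shape parameter near 1.16 reproduces
# the classic "80% from the top 20%" split.
pareto_drama = [random.paretovariate(1.16) for _ in range(n)]
# Thin-tailed comparison: positive values clustered around a common mean.
gaussian_drama = [max(0.01, random.gauss(1.0, 0.2)) for _ in range(n)]

print(f"Pareto:   top 20% account for {share_of_top(pareto_drama):.0%} of drama")
print(f"Gaussian: top 20% account for {share_of_top(gaussian_drama):.0%} of drama")
```

The shape parameter and the Gaussian spread are illustrative choices; the qualitative point survives any reasonable values, which is why the two distributions call for different moderation strategies.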
If we recognize that the problem per person follows a long-tailed distribution, our strategies should look very different than if it followed a normal (Gaussian) distribution.
That’s a good point. What kind of strategies do you think we should adopt?
I think simply not answering, or downvoting, is enough. You can point out that there is not sufficient evidence for you to take the time to study a long document that already seemed crackpotty on a quick skim. That is the truth, after all. And if the poster repeatedly spams, spamming itself warrants a ban.