Going Too Meta and Avoiding Controversy Carry Risks Too: A Case Study

Foreword/Update

I wrote this post hurriedly because I wanted to express, before I forgot them, some thoughts related to other conversations in EA from the last couple of weeks. While writing it, I suspected I might not get my point across clearly, and that suspicion was vindicated by a couple of comments on this post.

I’ve also been experimenting with ways of writing posts that differ from the conventional styles on the EA Forum. There are definitely ways I could have written this one better. I’m getting the sense that the reactions I’ve received to a couple of posts like this are converging on a mistake I’m making, one I’ll try to pivot away from in the future.

Astral Codex Ten: The Streisand Effect That Keeps on Giving

For those who don’t know, Scott Alexander is a writer and blogger at Astral Codex Ten who is popular among many in effective altruism. Seeing a rise in introspective criticism in EA (happening for reasons beyond the scope of this post), he wrote a post that was a criticism of criticism of criticism. As an example of a criticism of EA, from within EA, that he didn’t like, Scott cited Remmelt’s post ‘Some blind spots in effective altruism and rationality.’

Scott clarified multiple times that he likes the people in EA who write criticisms of EA and he thinks they’re good people, so he doesn’t mean anything personal by what he wrote. When asked by a commenter why he couched his own criticism of Remmelt’s criticism in those terms, Scott replied:

I really hate people attacking me online. It makes me miserable. And their attacks tend to be … false. Like if someone accuses me of being greedy, or writing something because of some specific sinister agenda, or something, I usually know why I wrote things and they’re just wrong.

And this blog is read by ~50,000 people. If I say something mean about some normal person without a huge audience, this may be one of the worst things that ever happen to them, in the same way that the NYT saying mean things about me was one of the worst things that ever happened to me.

And all of these people are effective altruists trying to make the world a better place, who are additionally writing their honest criticisms of EA to make it better. I hate that “this person writes a well-intentioned article intended to improve the world” ---> “they get insulted and used as an example of badness on a blog read by 50,000 people and they’re forever known as the person who got this wrong”. I hate that I have to write posts like this at all.

Someone told me they thought Scott had felt a need to write that post because it would put an end to the epidemic of EA’s excessive criticism of itself. That’s why I was thinking of not writing this post. I know others in EA who feel like the continuing debate over the value of criticizing whatever is a bunch of drama and a waste of time. They’d send me a link to a post called The Virtue of Silence from Scott’s former blog, Slate Star Codex, to discourage me from stoking potential drama further. The point would be for me not to contribute to the Streisand effect. From Wikipedia:

The Streisand effect is a phenomenon that occurs when an attempt to hide, remove, or censor information has the unintended consequence of increasing awareness of that information, often via the Internet.

It turns out the goal of Scott’s post wasn’t to end the endless cycle of criticism in EA, or at least it didn’t work, because he brought more attention to it yesterday with another post about that first post! The Virtue of Silence apparently isn’t relevant in this case, according to Scott himself, so here we are.

Getting back to how apologetic Scott felt he needed to be for bringing attention to a specific person like Remmelt: even granting that Scott thought or felt he had to write a post like ‘Criticism of Criticism of Criticism,’ he could at least have saved the part criticizing Remmelt’s post for a response post or comment on the Effective Altruism Forum.

If Scott knows from his experience with The New York Times how bad it feels to be on the receiving end of so many personal attacks online, then he could have avoided unnecessarily putting that risk on Remmelt, rather than mentioning him on his blog with ~50,000 readers!

There are more effective altruists who read Astral Codex Ten than read the EA Forum. There are more people who read Astral Codex Ten than there are effective altruists! If he had posted it somewhere with less of a reach, he would have had less to preemptively apologize for.

Alas, Scott brought it up to his 50,000 readers again anyway! It was okay, though, because it turned out Scott had overestimated that one risk for Remmelt. Remmelt was a good sport who handled himself well in the comments, including this one, worth mentioning because Scott himself highlighted it as one of the best comments on his ‘Criticism of Criticism of Criticism’ post. Unfortunately, the same kind of risk Scott was worried about applied to effective altruism as a community in general as well.

What Happens When You Play Fast and Loose with Narratives About Effective Altruism

Not intentionally but inadvertently, the way Scott presented EA in his post inspired some of the best takes on effective altruism of all time, real bangers, unimpeachable and impeccable insights, like these (emphasis added in all cases to the parts that are hilarious and/or notable).
Exhibit A:

I don’t know precisely what is the problem with EA, because I’m not putting in the epistemic work. But I feel *very* comfortable saying “I know it when I smell it; EA smells like it has good intentions plus toxic delusions and they’re not really listening”.

If you want the best definition I can personally come up with, it’s this: EA peer pressures people into accepting repugnant conclusions. Given that, *of course* it doesn’t want real criticism.

Exhibit B:

In the case of EA, I wouldn’t believe it because EA has the hallmarks of a peer pressure organization, and I think the criticism they’re most likely to discount is “the negative value of peer pressure outweighs the positive value of your work”. That’s not a fully general criticism of organizations; it’s a specific and potentially useful criticism of one type of organization.

I wouldn’t tell a shy conflict averse person not to work for the US Government. But I would tell them to avoid making contact with EA.

Exhibit C:

The idea of eliminating all suffering/evil in the world is dumb. Suffering is what makes us stronger, builds character, gives us something to fight against, (the hero vs the villain story). I’m not going to say we need more racists, misogynists, or chicken eaters but trying to eliminate all of them is a mistake. We’ve turned ‘no racism’ into a paper clip maximizer… and we should stop.

Parts of these criticisms of EA are ludicrous enough to be funny, but there are parts of them (except the last one) that reiterate some common impressions of EA that are:

  1. if not inaccurate, at least imprecise;

  2. based on perceiving problems in EA as a community as inherent to EA as a philosophy;

  3. exactly the kind of misconceptions about EA that the community is constantly trying to dispel or address, on account of how strongly they might needlessly repel a lot of people who might otherwise participate in EA.

Maybe it’s better that discourse like this happens on a public forum away from the EA community. It at least brings impressions of EA like these to the community’s attention, so there is an opportunity for critical points to be acknowledged and for wrong perceptions of EA to be dispelled. Many effective altruists responded decently to a lot of the concerns about EA in the comments on Scott’s post.

On the other hand, the media already creates enough of this kind of work for the EA community 24/7. For how important the reputation of EA is considered to be, constantly tending to concerns like this takes a lot of time. Nothing Scott said or did would have created extra work if he had published on the EA Forum. There, he would have been able to bring what he had to say to the attention of the people he most wanted to notice it.

If there was a risk of someone identified in a post on Astral Codex Ten getting personally attacked online, then there is a similar risk that a lot of people will end up thinking EA totally sucks for reasons Scott himself wouldn’t endorse. It can’t be emphasized enough that, this week, Scott risked contributing to the very kind of problem he wanted to avoid by posting on his blog with 50,000 readers.

It’s Signaling Most of the Way Up

One of the comments Scott highlighted as among the best on his post was from Zvi, whose Criticism of the Criticism Contest was a partial inspiration(?)[1] for Scott’s post. Zvi began with:

It is the dream of anyone who writes a post called Criticism of [a] Criticism Contest to then have a sort-of reply called Criticism of Criticism of Criticism.

The only question now is, do I raise to 4?

Zvi thankfully proceeds not to do that, but that’s not enough. The ultimate solution is to do the opposite: get as close as possible to ratcheting the number of meta levels EA discourse operates on down to zero. Why? Because going too meta sometimes causes everyone to become detached from, and forget the reality of, the object-level issues they really, ostensibly, care about.

A lot of the now over 600 comments on Scott’s original post were about:

  1. whether EA is an exclusively utilitarian philosophy.

  2. juxtaposing EA with religion, asking questions like, ‘Is EA compatible with religion?’ or ‘How is EA similar to a religion?’

Meanwhile, Effective Altruism for Christians began as an outreach project years ago and is now incorporated as a non-profit with at least 500 members and 3 annual conferences under its belt. Effective Altruism for Jews is another outreach effort on a similar trajectory. Effective Altruism for Muslims posted an update yesterday on their progress after launching a few months ago. There is a Facebook group called Buddhists in Effective Altruism with almost 400 members.

The Effective Altruism Global conference is in San Francisco this weekend. It’s a great opportunity for maybe hundreds of effective altruists of different religious backgrounds to connect with peers who share their faith. Yet for all anyone knows, a similar number of religious readers of Astral Codex Ten who otherwise might have liked EA may now think there is no place for them in the community. It could be almost zero of Scott’s religious readers, among those 50,000 readers overall, but the fact that this outcome won’t be tracked is part of the problem too.

What You Can Take Away from This Post

This is only one example among many of how so much meta-meta-discourse may inspire thousands of credulous people, both inside and outside of EA, to speculate rampantly, or to believe false or misleading information that also hurts EA’s reputation and capacity. Some of these aren’t matters of debate where more research is always needed. Some of them are easily verifiable matters of fact.

The examples from Astral Codex Ten are about issues with much lower stakes than the problems in EA that so many don’t want to contribute to making worse. To not hurt anyone’s reputation or feelings, to not stoke drama, to avoid awkwardness: that’s why so much critical or important discourse about EA in the community takes the form of vague social media or blog posts that never make it to the EA Forum. When misleading criticisms of EA never become legible to the community as a whole, stopping the spread of misinformation becomes much harder.

If you’re reading this thinking, “who, me?”, as one of the effective altruists whose Twitter or Facebook posts may sometimes contribute to this problem, answering that question is left as an exercise for the reader.

  1. ^

    I might be confused or wrong about this, so someone please clarify this if you’re in the know.