The ones that walk away

Alice: I’ve grown disillusioned with the EA community. I still want to dedicate my life to doing as much good as I can, but I am no longer certain that EA is the best way to do that.

Bob: I see where you’re coming from, but where specifically is that disillusionment leading you?

Alice: I am still confident that major EA causes are important areas to work on. I think EA organizations do good work in those areas, so I would be quite happy to work at some of them. On the other hand, I’m much less willing to defer to EA institutions than before, and I’m unlikely to attend EA events or personally associate with EAs. So I imagine mostly disengaging from EA, albeit with some professional interest in EA organizations.

Bob: You’re disentangling different aspects of EA as a community. We are linked first and foremost by our moral commitments: to widening our moral circle and to doing the most good for everyone within it. You still hold those commitments. On top of that, we’re also linked by the intellectual commitment to think rigorously and impartially about ways to do the most good. It sounds like you still believe in that, and the change is that you want to do more of that thinking for yourself and less of it through the EA collective consciousness. Is that right?

Alice: Pretty much.

Bob: But if you still hold the moral and intellectual commitments that define effective altruism, why do you want to disengage from EA?

Alice: For me, the social dimension creates a dangerous tribalism. I get upset when people criticize EA, whether on Twitter or in person, and I feel the need to defend it. My in-group bias is being activated to defend people and arguments that I would not otherwise defend.

Bob: Isn’t being cognizant of tribalism enough to help you avoid it?

Alice: That’s unlikely, at least for me. I’m not a brain in a vat; my emotions matter to me. They don’t dictate every action I take, but they have some sway. Furthermore, everyone thinks they are above tribalism, so we should be skeptical of our own ability to succeed where everyone else fails.

Bob: Point taken, but this argument proves too much. This is not just an argument against identifying with EA—it’s an argument against identifying with any collective, since every collective makes you feel some tribalism.

Alice: And that’s exactly what I’m defending. I think it makes sense to work with collectives to accomplish shared goals—as I said, I would still work at EA organizations—but I am much less excited about identifying with them. That shared identity is not necessary for us to do good work together, and it creates a lot of scope for abuse.

Bob: That feels uncomfortably transactional. Can you really work with someone towards a shared goal that is meaningful to you without feeling some bond with them? Don’t you feel kinship with people who care about animal suffering, for example?

Alice: Well… I see what you mean, so I’ll step back from the strong claim. But the EA community is far more tightly knit than that basic moral kinship. We have group houses, co-working spaces, student groups, conferences with afterparties, a CEA community health team, the Forum, Dank EA Memes, EA Twitter… this is not your average community, and the typical EA could probably step back quite a lot while retaining that kinship and the sense of working together to make the world better.

Bob: It’s true that this is a highly-engaged community, but most of those aren’t just for fun; they have some role in our ability to do good. To pick on two examples you listed, I’ve met people at conferences who I learnt a lot from, and the Forum is one of the best websites on the internet if you filter it aggressively. I wouldn’t take this reasoning too literally, but I still suspect that if you disengaged from the Forum and stopped meeting EAs at conferences, it would reduce your impact.

Alice: I’m uncomfortable with that kind of reasoning. “If you engage less with the specific EA movement, you will renege on your moral commitments” sounds like “if you engage less with our specific religious institution, you will renege on your commitment to God”. It could be true, but it’s also a self-serving belief for communities to hold.

Bob: That’s undeniable, and I would never take this argument into the realm of moral blackmail. At the same time, it’s a real possibility. Do you have a specific reason to believe it’s wrong?

Alice: Nope, and I’m comfortable with that.

Bob: I understand your concern, but EA started as a bunch of individuals reading a book, donating some money and going on with life. There are sensible reasons why we quickly grew out of that; it’s just not the way to do the most good. We need to act collectively, to form organizations that scale, and we need to exchange ideas to learn from each other, so I think even the best possible version of the EA community is not far from what we have now.

Alice: Of course EA had to grow, but the question comes back to whether this community is actually realizing the promise of that growth. I don’t want to lean too heavily on recent events, but I think they genuinely show that we have less of a handle on doing good than we thought.

Bob: I think that’s only true because people previously had unusually naive beliefs about EA infallibility. Now that EA infallibility has been smashed to pieces, we are all moving to a new relationship with EA. I’m optimistic that this new relationship will be much healthier than before, and I think you underestimate our ability to push it in that direction. I can personally think of a few Forum posts and conversations with people that have changed my mind about important topics, and with a community as small as ours, it doesn’t take a lot to have an effect. Yes, we can have groupthink and bad equilibria, but we aren’t naive about those things, and we are constantly trying to do better. Given that, I personally can’t be too pessimistic about the future of EA.

Alice: I like that vision, and I have zero interest in dismissing it. But as someone whose unusually naive beliefs about EA were smashed, I don’t think it’s as simple as hitting reload on a save file. I personally need to re-interrogate ways to do good with my life, and whether non-EA frameworks rise to that challenge better than EA has. Maybe the end result of that exploration will be a renewed confidence in EA, and maybe it won’t. Regardless, I need a clean slate, so I am disengaging from the EA community. And maybe that’s also good for my long-term impact because [post-hoc rationalization].

Bob: Well then, godspeed. I hope you find what you’re looking for.