This is an interesting essay. My thinking is that "coalition norms", the norms under which politics operates, trade off epistemic rationality for instrumental rationality. I can argue that it's morally correct, from a consequentialist point of view, to tell a lie in order to get my favorite politician elected so they will pass some critical policy. But this is a Faustian bargain in the long run, because it sacrifices the group's epistemology, and it causes the people with the best arguments against the group's thinking to leave in disgust or never join in the first place.
I'm not saying EAs shouldn't join political coalitions. But I think we'd be sacrificing a lot if the EA movement itself began sliding toward coalition norms. If you think some coalition is the best one, you can go off and work with that coalition. If you don't like any of the existing ones, you can create your own, or even join one and try to improve it from the inside.