I agree that most EAs identify with consequentialism, and that the proportion was likely higher in the past. I also lean consequentialist myself. But that’s not what we disagree about. You move from ‘The majority of EAs lean consequentialist’ to ‘The only ideas EA should consider seriously are utilitarian ones’ - and it is that move I disagree with.
Moral Uncertainty is a book about what to do when there are multiple plausible ethical theories, written by two of EA’s leading lights, Toby Ord and Will MacAskill (along with Krister Bykvist). Perhaps you could consider it.