The aim of EA is to do the most good, and to attract people who are open to updating their views on how to carry out this project. If the "weirdness" of EA means that we repel people who aren't committed to doing the most good, that's probably a good thing. And I'd be far more trusting of utilitarians/consequentialists within EA than of those who practice common-sense morality (who let millions of people die every year), egoists (who'd save themselves even if it meant millions died), deontologists (who wouldn't lie to protect me from a totalitarian regime), religious fundamentalists, or people who say they want to nuke San Francisco.