At its heart, EA seems naturally to promote a few things:
a larger moral circle is better than a smaller one
considered reasoning (“rationality”) is better than acting on other grounds alone
efficiency in generating outcomes is better than inefficiency, even if efficiency is less appealing at an emotional level
I don’t know that any of this is what EA should promote, and I’m not sure anyone can unilaterally decide what is normative for EA. So instead I offer these as the norms I think EA is currently promoting in fact, regardless of what anyone thinks EA should be promoting.