I was just thinking about this the other day. When pitching effective altruism, I think it’s best to keep things simple instead of overwhelming people with different concepts. We can boil your moral claims down to essentially three core beliefs of EA:
Doing good is good. (Defining good)
It is more good to do more good. (Maximization)
Therefore, we ought to do more good. (Moral obligation)
If you buy these three beliefs, great! You can probably consider yourself an effective altruist, or at least aligned with effective altruism. Everything else is downstream of these three beliefs and up for debate (and EAs excel at debating!).