I don’t think that the moral differences between longtermists and most people in similar circles (e.g. WEIRD) are that relevant, actually. You don’t need to be a longtermist to care about massive technological change happening over the next century. So I think it’s straightforward to say things like “We should try to have a large-scale moral impact. One very relevant large-scale harm is humans going extinct; so we should work on things which prevent it”.
This is what I plan to use as a default pitch for EA from now on.