I’ll start by saying I absolutely think it’s a terrible idea to try to destroy humanity. I am 100% not saying we should do that. Ok, now that we have that out of the way. If you decide to commit your life to x-risk reduction because there are “trillions of units of potential value in the future”, you are in a bit of a sticky situation if someone credibly argues that the expected value of the future is lower if humans become grabby than if they don’t. And that’s ok! It’s still probably one of the highest-EV things you can do.
And I’ll say it again years later, https://forum.effectivealtruism.org/posts/KDjEogAqWNTdddF9g/long-termism-vs-existential-risk
this ^ post is not great. The entire thing basically presupposes that human society is net positive, that aliens will not exist, and that animals will not re-evolve intelligence if we die out. I wouldn’t bring this up if not for it being one of the most upvoted posts on the forum ever (top 5 if you don’t include posts about EA drama).