If you accept that improving the long-term value of the future is more important than reducing x-risk
Do you mean “If you accept that improving the long-term value of the future is more important than reducing extinction risk” (as distinct from existential risk more broadly, which already includes other ways of improving the value of the future)?
Or “If you accept that improving the long-term value of the future is more important than reducing the risk of existential catastrophe in the relatively near future”?
Or something else (e.g., about smaller trajectory changes)?
I meant to distinguish between long-term efforts and reducing x-risk in the relatively near future (the second case on your list), sorry that was unclear.
I liked this comment.