The following things could both be true:
1) Humanity has a >80% chance of completely perishing in the next ~300 years.
2) The expected value of the future is incredibly, ridiculously high!
The trick is that the value of a positive outcome could be just insanely great: dramatically, incredibly, totally better than basically anyone talks about.
Think: expanding to a great deal of the universe, dramatically improving our ability to convert matter and energy into net well-being, and researching strategies to expand beyond the universe.
A 20%, or even a 0.002%, chance at a 10^20 outcome is still really good.
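The arithmetic here is just expected value: probability times payoff. A minimal sketch, with purely illustrative numbers (the 10^9 "sure-thing" baseline is my own assumption, not a figure from the post):

```python
def expected_value(p_success: float, payoff: float) -> float:
    """EV of a binary gamble: receive `payoff` with probability p_success, else 0."""
    return p_success * payoff

# Even a 0.002% chance at a 10^20 outcome dwarfs a guaranteed but bounded one.
long_shot = expected_value(0.00002, 1e20)  # ≈ 2e15
baseline = expected_value(1.0, 1e9)        # hypothetical "sure thing" future
print(long_shot > baseline)
```

The point of the comparison: a tiny probability of an astronomically large payoff can still dominate a certain but merely very good outcome, which is why the two claims above are compatible.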
One key question is the balance of long-term negative[1] vs. long-term positive outcomes in expectation. I think most people are pretty sure that things are positive in expectation, but this is less clear than it might seem.
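To make the negative-vs-positive question concrete, the same expected-value arithmetic can include outcomes with negative value. All probabilities and magnitudes below are invented for illustration only; nothing here is a real estimate:

```python
# Toy signed expected value over three stylized futures.
# (probability, value) pairs; values in arbitrary well-being units.
outcomes = {
    "flourishing future": (0.10, 1e20),
    "extinction": (0.85, 0.0),
    "vast suffering": (0.05, -1e20),  # s-risk: survival with huge disvalue
}

ev = sum(p * v for p, v in outcomes.values())
print(ev > 0)  # positive only if the good tail outweighs the bad one
```

With these made-up numbers the EV comes out positive, but shifting a little probability mass from the good tail to the bad one flips the sign, which is exactly why the expectation is "less clear" than it might seem.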
So, remember:
Just because the picture of X-risks might look grim in terms of percentages, you can still be really optimistic about the future. In fact, many of the people most concerned with X-risks are those *most* optimistic about the future.
I wrote about this a while ago, here:
https://www.lesswrong.com/.../critique-my-model-the-ev-of...
[1] Humanity lasts, but creates vast worlds of suffering. “S-risks”
https://www.facebook.com/ozzie.gooen/posts/10165734005520363