An existential risk is a risk that threatens the destruction of the long-term potential of life.[1] An existential risk could threaten the extinction of humans (and other sentient beings), or it could threaten some other unrecoverable collapse or permanent failure to achieve a potential good state. Natural risks such as those posed by asteroids or supervolcanoes could be existential risks, as could anthropogenic (human-caused) risks like accidents from synthetic biology or unaligned artificial intelligence.
Estimating the probability of existential risk from different sources is difficult, but some estimates have been made.[1]
Some view reducing existential risks as a key moral priority, for a variety of reasons.[2] Some people simply regard the current estimates of existential risk as unacceptably high. Other authors argue that existential risks are especially important because the long-run future of humanity matters a great deal.[3] Many believe that there is no intrinsic moral difference between the importance of a life today and one in a hundred years, and there may be many more people in the future than there are now. Given these assumptions, existential risks threaten not only the beings alive right now but also the enormous number of lives yet to be lived. One objection to this argument is that people have a special responsibility to other people currently alive that they do not have to people who have not yet been born.[4] Another objection is that, although existential risks would in principle be important to manage, they are currently so unlikely and poorly understood that efforts to reduce them are less cost-effective than work on other promising areas.
In The Precipice: Existential Risk and the Future of Humanity, Toby Ord offers several policy and research recommendations for handling existential risks:[5]
- Explore options for new international institutions aimed at reducing existential risk, both incremental and revolutionary.
- Investigate possibilities for making the deliberate or reckless imposition of human extinction risk an international crime.
- Investigate possibilities for bringing the representation of future generations into national and international democratic institutions.
- Each major world power should have an appointed senior government position responsible for registering and responding to existential risks that can be realistically foreseen in the next 20 years.
- Find the major existential risk factors and security factors, both in terms of absolute size and in the cost-effectiveness of marginal changes.
- Target efforts at reducing the likelihood of military conflicts between the US, Russia and China.
- Improve horizon-scanning for unforeseen and emerging risks.
- Investigate food substitutes in case of extreme and lasting reduction in the world’s ability to supply food.
- Develop better theoretical and practical tools for assessing risks with extremely high stakes that are either unprecedented or thought to have extremely low probability.
- Improve our understanding of the chance civilization will recover after a global collapse, what might prevent this, and how to improve the odds.
- Develop our thinking about grand strategy for humanity.
- Develop our understanding of the ethics of existential risk and valuing the long-term future.
Further reading
Bostrom, Nick (2002) Existential risks: analyzing human extinction scenarios and related hazards, Journal of Evolution and Technology, vol. 9.
A paper surveying a wide range of non-extinction existential risks.
Bostrom, Nick (2013) Existential risk prevention as global priority, Global Policy, vol. 4, pp. 15–31.
Matheny, Jason Gaverick (2007) Reducing the risk of human extinction, Risk Analysis, vol. 27, pp. 1335–1344.
A paper exploring the cost-effectiveness of extinction risk reduction.
Ord, Toby (2020) The Precipice: Existential Risk and the Future of Humanity, London: Bloomsbury Publishing.
Ord, Toby (2020) Existential risks to humanity in Pedro Conceição (ed.) The 2020 Human Development Report: The Next Frontier: Human Development and the Anthropocene, New York: United Nations Development Programme, pp. 106–111.
Sánchez, Sebastián (2022) Timeline of existential risk, Timelines Wiki.
Related entries
civilizational collapse | criticism of longtermism and existential risk studies | dystopia | estimation of existential risks | ethics of existential risk | existential catastrophe | existential risk factor | existential security | global catastrophic risk | hinge of history | longtermism | Toby Ord | rationality community | Russell–Einstein Manifesto | s-risk
1. Ord, Toby (2020) The Precipice: Existential Risk and the Future of Humanity, London: Bloomsbury Publishing.
2. Todd, Benjamin (2017) The case for reducing existential risks, 80,000 Hours website. (Updated June 2022.)
3. Beckstead, Nick (2013) On the Overwhelming Importance of Shaping the Far Future, PhD thesis, Rutgers University.
4. Roberts, M. A. (2009) The nonidentity problem, Stanford Encyclopedia of Philosophy, July 21 (updated 1 December 2020).
5. Ord, Toby (2020) The Precipice: Existential Risk and the Future of Humanity, London: Bloomsbury Publishing, pp. 280–281.
Below I consider changes for this Wiki page.
The sentence "Natural risks such as those posed by asteroids or supervolcanoes could be existential risks, as could anthropogenic (human-caused) risks like accidents from synthetic biology or unaligned artificial intelligence" is, in my view, insufficient in capturing what existential risks humanity faces. I believe that having the list of existential risks covered in Bruce E. Tonn’s Anticipation, Sustainability, Futures and Human Extinction on the EAF Existential Risk Wiki would be substantially more helpful to EAF readers than that sentence alone.
Some or all of Tonn’s explanations could be replaced or supplemented with updated and/or more comprehensive information. Even if those on this forum versed in existential risk choose to do away with most of Tonn’s descriptions, I still believe that whatever remains of the risk framework below would be a useful development for this Wiki page.
Here is the list of existential risks in Tonn’s book, without their explanations:
Of course, to understand some of these risk classifications adequately, the context Tonn provides in the book is needed. One of Tonn’s explanations for the first risk category captures this idea:
So, many of these existential risks might be better classified as extreme risks or global catastrophic risks (GCRs), or as events that greatly increase the chance of something else causing extinction shortly afterwards (on a geological timescale), i.e. a chain event. Should Tonn’s list be incorporated into this Wiki page, I think providing explanations next to each risk, and perhaps next to each risk category as well, would be a good approach. If given permission by the community, I would begin by inserting this framework as you see it now, and would then (1) link each risk to its Wikipedia page or flagship paper; (2) provide an explanation for each risk and risk category, sometimes drawing on the same sources as Tonn; and (3) optimize for brevity in doing (1) and (2).
Beyond covering the actual existential risks listed on this Wiki page, I think copying some parts of the LessWrong Wiki concept page for Existential Risk (see https://www.lesswrong.com/tag/existential-risk) would be a good idea. The highest-priority action I can think of would be including Bostrom’s 2012 classifications of existential risks, which, in my opinion, would fit well with Tonn’s risk framework.
I do not have much more to say for now regarding this Wiki page.
Please share your thoughts on these proposed edits. If people support them, I will make them. If people support them conditional on some further changes, I will update the edits accordingly and then make them.
Thank you for reading this!
Also, pinging @Pablo given the extent of his contributions to the EAF Wiki pages.