A global catastrophic risk (GCR) is an event that poses a risk of major harm on a global scale.[1]
GCRs include, but are not restricted to, existential risks. Examples of non-existential GCRs include risks of hundreds of millions of people dying due to a natural pandemic or due to anthropogenic climate change.
Such catastrophic risks have obviously bad direct effects: they may involve many people dying, or our technological capabilities being greatly reduced. There may also be bad indirect effects, for instance by destabilizing political systems in a way that increases the likelihood of war or totalitarian government.
Some GCRs that are not themselves existential risks could still increase existential risk via their indirect effects. Such GCRs may be regarded as existential risk factors, or as components of a compound existential risk. For example, climate change might arguably increase political tensions, hastening nuclear or biological warfare. Alternatively, a catastrophe might leave humanity's long-term potential intact: civilization could eventually rebound to something like its previous state. The Black Death, the deadliest catastrophe in human history, killed something like 10% of the world's population without obviously affecting humanity's long-term potential.[2]
Even if global catastrophic risks do not pose an existential risk, they might still be high-priority causes justified purely by their nearer-term consequences.
Further reading
Aird, Michael (2020) Collection of some definitions of global catastrophic risks (GCRs), Effective Altruism Forum, February 28.
Many additional resources on this topic.
Avin, Shahar et al. (2018) Classifying global catastrophic risks, Futures, vol. 102, pp. 20–26.
Bostrom, Nick & Milan Ćirković (eds.) (2008) Global Catastrophic Risks, Oxford: Oxford University Press.
Cotton-Barratt, Owen et al. (2016) Global catastrophic risks 2016, Global Priorities Project.
A report examining various types of global catastrophic risk.
Koehler, Arden & Keiran Harris (2020) Owen Cotton-Barratt on epistemic systems & layers of defence against potential global catastrophes, 80,000 Hours, December 16.
Open Philanthropy (2016) Global catastrophic risks, Open Philanthropy, March 2.
Related entries
civilizational collapse | existential risk | existential risk factor | global catastrophic biological risk | Nuclear Threat Initiative | warning shot
[1] Bostrom, Nick & Milan Ćirković (eds.) (2008) Global Catastrophic Risks, Oxford: Oxford University Press.
[2] Muehlhauser, Luke (2017) How big a deal was the Industrial Revolution?, Luke Muehlhauser’s Website.