Some global catastrophic risk estimates

In October 2018, I developed a question series on Metaculus on extinction events, spanning risks from nuclear war, bio-risk, climate change and geo-engineering, Artificial Intelligence, and nanotechnology failure modes. Since then, these questions have accrued over 3,000 predictions (ETA: as of today, the number is around 5,000).

A catastrophe is defined as a reduction in the human population of at least 10% in any period of 5 years or less. A (near) extinction event is one that reduces the human population by at least 10% within 5 years and by at least 95% within 25 years.
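
To make these thresholds concrete, here is a minimal sketch in Python (the `classify_event` helper is hypothetical, not part of the Metaculus questions, and it simplifies the "any period of 5 years or less" clause to a fixed 5-year window):

```python
def classify_event(pop_before: float, pop_after_5y: float, pop_after_25y: float) -> str:
    """Classify a population decline against the thresholds used in the post.

    pop_before:    population at the start of the event
    pop_after_5y:  population 5 years later
    pop_after_25y: population 25 years later
    """
    drop_5y = 1 - pop_after_5y / pop_before
    drop_25y = 1 - pop_after_25y / pop_before
    if drop_5y >= 0.10 and drop_25y >= 0.95:
        return "(near) extinction"
    if drop_5y >= 0.10:
        return "catastrophe"
    return "neither"

# An 8.1B population falling to 6.0B within 5 years (~26% drop) is a
# catastrophe, but a further decline to 5.5B at 25 years still falls well
# short of the 95% drop required for (near) extinction.
print(classify_event(8.1e9, 6.0e9, 5.5e9))  # -> "catastrophe"
```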

Here’s a summary of the results as they stand today (September 24, 2023), ordered by risk of near extinction:

| Global catastrophic risk | Chance of catastrophe by 2100 | Chance of (near) extinction by 2100 |
|---|---|---|
| Artificial Intelligence | 6.16% | 3.39% |
| Other risks | 1.52% | 0.13% |
| Biotechnology or bioengineered pathogens | 1.52% | 0.07% |
| Nuclear war | 2.86% | 0.06% |
| Nanotechnology | 0.02% | 0.01% |
| Climate change or geo-engineering | 0.00% | 0.00% |
| Natural pandemics | 0.62% | N/A |

These predictions are generated by aggregating forecasters’ individual predictions based on their track records. Specifically, the predictions are weighted by a function of each forecaster’s ‘skill’, where skill is estimated from their relative performance on resolved questions (typically many hundreds of them).
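
Metaculus does not publish the exact weighting scheme, so the following Python sketch is only illustrative: it pools forecasts as a skill-weighted mean in log-odds space, with both the skill scores and the pooling rule being assumptions rather than the platform’s actual method.

```python
import numpy as np

def skill_weighted_probability(probs, skills):
    """Pool binary-event forecasts, up-weighting forecasters with better
    track records.

    probs:  individual probability forecasts in (0, 1)
    skills: non-negative skill scores, e.g. derived from accuracy on
            previously resolved questions
    """
    probs = np.clip(np.asarray(probs, dtype=float), 1e-6, 1 - 1e-6)
    weights = np.asarray(skills, dtype=float)
    weights = weights / weights.sum()
    log_odds = np.log(probs / (1 - probs))
    pooled = np.sum(weights * log_odds)   # weighted mean in log-odds space
    return 1 / (1 + np.exp(-pooled))      # map back to a probability

# Three forecasters; the one with the strongest track record (skill 3.0)
# pulls the aggregate toward their 5% forecast.
print(skill_weighted_probability([0.02, 0.05, 0.20], [1.0, 3.0, 0.5]))  # ~0.048
```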

If we assume that these events are independent, the predictions suggest that there’s a ~17% chance of catastrophe and a ~1.9% chance of (near) extinction by the end of the century. Admittedly, independence is likely to be an inappropriate assumption, since, for example, some catastrophes could exacerbate other global catastrophic risks.[1]
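
For reference, the independence assumption corresponds to combining the per-risk probabilities $p_i$ as

$$P(\text{at least one catastrophe by 2100}) = 1 - \prod_i (1 - p_i),$$

with the analogous product taken over the (near) extinction column.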

Interestingly, the predictions indicate that although nuclear war and bioengineered pathogens carry substantial catastrophe-level risk, an AI failure mode is by far the biggest source of extinction-level risk: it is at least 5 times more likely to cause near extinction than all other risks combined.

Links to all the questions on which these predictions are based may be found here.

For reference, these were the estimates when I first posted this (June 19, 2022):

| Global catastrophic risk | Chance of catastrophe by 2100 | Chance of (near) extinction by 2100 |
|---|---|---|
| Artificial Intelligence | 3.06% | 1.56% |
| Other risks | 1.36% | 0.11% |
| Biotechnology or bioengineered pathogens | 2.21% | 0.07% |
| Nuclear war | 1.87% | 0.06% |
| Nanotechnology | 0.17% | 0.06% |
| Climate change or geo-engineering | 0.51% | 0.01% |
| Natural pandemics | 0.51% | N/A |

[1] I explore the assumptions made in order to compute these probabilities in more depth here.