Probability of extinction for various types of catastrophes

Summary

  • I estimated the probability of extinction for various types of catastrophe in the 21st century in this Sheet[1]. The results are in the table below.

    • The inputs are predictions from Metaculus’ Ragnarok Series[2], guesses provided by Luisa Rodriguez here, and guesses from me.

  • Relative to Toby Ord’s best guesses for the existential risk from 2021 to 2120 given in The Precipice, my analysis suggests the relative importance of:

    • Artificial intelligence is similar (rounded to half an order of magnitude).

    • Climate change and geoengineering, synthetic biology, and “other” is half an order of magnitude lower.

    • Nuclear war is one order of magnitude higher.

| Type of catastrophe (in the 21st century) | Probability of extinction (%) |
| --- | --- |
| Any* | 4.26 |
| Any | 1.77 |
| Artificial intelligence | 2.84 |
| Climate change and geoengineering | 0.0106 |
| Nanotechnology | 0.245 |
| Nuclear war | 0.299 |
| Synthetic biology | 0.220 |
| Other | 0.695 |

Acknowledgements

Thanks to David Denkenberger, Eli Lifland, Gregory Lewis, Misha Yagudin, Nuño Sempere, and Tamay Besiroglu.

Methods

I calculated the probability of extinction for catastrophes in the 21st century caused by:

  • Artificial intelligence.

  • Climate change and geoengineering.

  • Nanotechnology.

  • Nuclear war.

  • Synthetic biology.

  • Other.

  • Any.

  • Any*.

The results for “any” do not explicitly depend on those of the first 6 types of catastrophe listed above, whereas those for “any*” are calculated assuming independence between them (see the sketch below).
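As a minimal sketch of the “any*” aggregation, assuming independence between the 6 specific types of catastrophe and taking the per-type probabilities of extinction from the results section:

```python
# Sketch of the "any*" aggregation, assuming independence between the
# 6 specific types of catastrophe. The inputs are the per-type
# probabilities of extinction (%) from the results section.
p_extinction = {
    "artificial intelligence": 2.84,
    "climate change and geoengineering": 0.0106,
    "nanotechnology": 0.245,
    "nuclear war": 0.299,
    "synthetic biology": 0.220,
    "other": 0.695,
}

p_no_extinction = 1.0
for p in p_extinction.values():
    p_no_extinction *= 1 - p / 100  # this type does not cause extinction

print(f"any*: {1 - p_no_extinction:.2%}")  # ~4.26 %, matching the table
```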

The inputs to the calculations are Metaculus’ community predictions, the guesses provided by Luisa Rodriguez, and my own guesses.

Concretely, I calculated the probability of extinction from the sum of the following 3 products (see tab “Probability of extinction by catastrophe” of this Sheet, and the sketch after this list):

  • The probability of the population loss being between 0 and 10 %, multiplied by the sum of the products between the probability of extinction given a population loss between 0 and 10 % under each scenario and the probability of that scenario.

  • The probability of the population loss being between 10 % and 95 %, multiplied by the sum of the products between the probability of extinction given a population loss between 10 % and 95 % under each scenario and the probability of that scenario.

  • The probability of the population loss being between 95 % and 1, multiplied by the sum of the products between the probability of extinction given a population loss between 95 % and 1 under each scenario and the probability of that scenario.
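As a minimal sketch of this sum of products, here it is applied to nuclear war, with the range, scenario, and extinction probabilities taken from the results tables below (all values as fractions):

```python
# Sketch of the sum of 3 products described above, using the nuclear war
# inputs from the results tables below (all values as fractions).
p_loss_range = [0.904, 0.0922, 0.00384]  # P(loss in range), 3 ranges
p_scenario = [0.0, 1/3, 2/3]             # P(scenario | loss), nuclear war
p_ext_given = [                          # P(extinction | range, scenario),
    [0.0, 0.00176, 0.0],                 # columns: no damage, both, either
    [0.000413, 0.0205, 0.00291],
    [0.251, 0.551, 0.372],
]

p_extinction = sum(
    p_range * sum(p_e * p_s for p_e, p_s in zip(row, p_scenario))
    for p_range, row in zip(p_loss_range, p_ext_given)
)
print(f"{p_extinction:.3%}")  # ~0.300 %, matching the nuclear war total
                              # of 0.299 % up to rounding
```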

Population loss

I computed the probability of the population loss falling into each of the 3 population loss ranges presented above based on the complementary cumulative distribution function (CCDF) of the population loss (see tab “Probability of the population loss”). I assumed the CCDF decreases linearly between each consecutive pair of the following points[3] (see tab “CCDF of the population loss”):

  • Population loss of 0, and CCDF of 1, i.e. probability 1 of the catastrophe decreasing the population size.

  • Population loss of 10 %, and CCDF given by the product between:

    • The probability of a population loss greater than 10 %.

    • The probability of such population loss being caused by a certain type of catastrophe, given that it occurred.

  • Population loss of 95 %, and CCDF given by the product between:

    • The probability of a population loss greater than 10 % being caused by a certain type of catastrophe, which equals the product just above.

    • The probability of a population loss greater than 95 % being caused by that type of catastrophe, given that it caused a population loss greater than 10 %.

  • Population loss of 1, and CCDF of 0, i.e. probability 0 of the population after the catastrophe being negative.

I set the probabilities required to determine the CCDF for the population losses of 10 % and 95 % to Metaculus’ community predictions (collected in tab “Metaculus’ predictions”).
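The sketch below illustrates how the probabilities of the 3 ranges follow from these CCDF points. The 3 input probabilities are illustrative placeholders, not the live Metaculus community predictions (they happen to reproduce the nuclear war row of the results):

```python
# Sketch of the range probabilities derived from the piecewise linear CCDF.
# The 3 inputs below are illustrative, not the live Metaculus predictions.
p_loss_over_10 = 0.24    # P(population loss > 10 %) (illustrative)
p_type_given_10 = 0.40   # P(caused by this type | loss > 10 %) (illustrative)
p_95_given_10 = 0.04     # P(loss > 95 % | loss > 10 % from this type) (illustrative)

ccdf = {
    0.00: 1.0,
    0.10: p_loss_over_10 * p_type_given_10,
    0.95: p_loss_over_10 * p_type_given_10 * p_95_given_10,
    1.00: 0.0,
}

# P(loss in [a, b]) = CCDF(a) - CCDF(b).
p_0_to_10 = ccdf[0.00] - ccdf[0.10]   # 90.4 %
p_10_to_95 = ccdf[0.10] - ccdf[0.95]  # 9.22 %
p_95_to_1 = ccdf[0.95] - ccdf[1.00]   # 0.384 %
```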

Probability of extinction

I calculated the probability of extinction for each of the 3 population loss ranges presented above for 3 exhaustive scenarios (see tab “Probability of extinction by scenario”):

  • Without major infrastructure damage or (major) climate change.

  • With major infrastructure damage and (major) climate change.

  • With either major infrastructure damage or (major) climate change.

To illustrate what is intended by “major infrastructure damage” and “major climate change”, Luisa writes:

  • “Major infrastructure damage”: “e.g. damaged roads, destroyed bridges, collapsed buildings, damaged power lines”.

  • “Major climate change”: “e.g. nuclear winter”.

For the 1st and 2nd scenarios, I determined the probability of extinction for each population loss range from the mean value of the extinction probability over that range. For the 3rd, I computed it from the geometric mean of the values for the 1st and 2nd scenarios (see the sketch below).
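As a minimal sketch of the geometric mean for the 3rd scenario, using the values for a population loss between 10 % and 95 % from the results tables below:

```python
import math

# P(extinction | loss between 10 % and 95 %) for the first two scenarios,
# from the results tables below (fractions).
p_no_damage = 0.000413  # no major infrastructure damage or climate change
p_both = 0.0205         # major infrastructure damage and climate change

p_either = math.sqrt(p_no_damage * p_both)  # geometric mean
print(f"{p_either:.3%}")  # ~0.291 %, matching the "either" row
```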

I supposed the probability of extinction as a function of the population loss to increase linearly between each consecutive pair of the following points (see the sketch after this list):

  • Without major infrastructure damage or climate change:

    • Population loss of 0, and probability of extinction of 0.

    • Population loss of 50 %, and probability of extinction of PE_1 = (0*0.0001)^0.5 = 0.

    • Population loss of 99.99 %, and probability of extinction of 0.173 %, which I estimated as 1 % of the probability of extinction for the same population loss, but with major infrastructure damage and climate change.

    • Population loss of PL = 1 − 10^-5.5 = 99.9997 %, and probability of extinction of 50 %.

    • Population loss of 1, and probability of extinction of 1.

  • With major infrastructure damage and climate change:

    • Population loss of 0, and probability of extinction of 0.

    • Population loss of 90 %, and probability of extinction of PE_2 = 10^-1.5 = 3.16 %.

    • Population loss of 99.99 %, and probability of extinction of PE_3 = (0.1*0.3)^0.5 = 17.3 %.

    • Population loss of 1, and probability of extinction of 1.
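The sketch below implements this piecewise linear interpolation for both scenarios, with the points above sorted by population loss; the range means reported in the results section are computed in the Sheet, not here:

```python
import numpy as np

# Sketch of the piecewise linear interpolation for both scenarios, with the
# points above sorted by population loss (all values as fractions).
loss_no_damage = [0.0, 0.50, 0.9999, 1 - 10**-5.5, 1.0]
p_ext_no_damage = [0.0, 0.0, 0.00173, 0.50, 1.0]

loss_with_damage = [0.0, 0.90, 0.9999, 1.0]
p_ext_with_damage = [0.0, 10**-1.5, (0.1 * 0.3) ** 0.5, 1.0]

def p_extinction(loss, points, values):
    """P(extinction) at a given population loss, by linear interpolation."""
    return float(np.interp(loss, points, values))

# E.g. at a population loss of 99 %:
print(f"{p_extinction(0.99, loss_no_damage, p_ext_no_damage):.4%}")      # ~0.17 %
print(f"{p_extinction(0.99, loss_with_damage, p_ext_with_damage):.2%}")  # ~16 %
```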

PE_1, PE_2 and PE_3 are the geometric means between the lower and upper bounds of the best guesses provided by Luisa here, and PL is 1 minus the geometric mean between the bounds for the corresponding fraction of survivors (see the sketch after this list):

  • For the population loss of 50 % without major infrastructure damage or climate change (PE_1):

    • “Case 1: I [Luisa] think it’s exceedingly unlikely [probability “< 0.0001”, i.e. between 0 and 0.0001; see 1st table] that humanity would go extinct (within ~a generation) as a direct result of a catastrophe that causes the deaths of 50% of the world’s population, but causes no major infrastructure damage (e.g. damaged roads, destroyed bridges, collapsed buildings, damaged power lines, etc.) or extreme changes in the climate (e.g. cooling)”.

  • For the population loss at which the probability of extinction reaches 50 %, without major infrastructure damage or climate change (PL):

    • “My [Luisa’s] best guess is that the turning point at which extinction goes from under 50% to over 50% is between 99.999% population death (80,000) and 99.9999% (8,000) population death (even before considering additional starting conditions like infrastructure damage or climate change)”.

  • For the population loss of 90 % with major infrastructure damage and climate change (PE_2):

    • “Case 2: I [Luisa] think it’s very unlikely [probability “between 0.01 and 0.1”] that humanity would go extinct as a direct result of a catastrophe that caused the deaths of 90% of the world’s population (leaving 800 million survivors), major infrastructure damage, and severe climate change (e.g. nuclear winter/​asteroid impact)”.

  • For the population loss of 99.99 % with major infrastructure damage and climate change (PE_3):

    • “Case 3: I [Luisa] think it’s fairly unlikely [probability “between 0.1 and 0.3”] that humanity would go extinct as a direct result of a catastrophe that caused the deaths of 99.99% of people (leaving 800 thousand survivors), extensive infrastructure damage, and temporary climate change (e.g. a more severe nuclear winter/​asteroid impact, plus the use of biological weapons)”.
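As a minimal sketch, these 4 parameters follow from Luisa’s bounds quoted above as:

```python
# Geometric means of Luisa's bounds quoted above (fractions).
PE_1 = (0 * 0.0001) ** 0.5      # = 0
PE_2 = (0.01 * 0.1) ** 0.5      # = 10^-1.5, i.e. ~3.16 %
PE_3 = (0.1 * 0.3) ** 0.5       # ~17.3 %
PL = 1 - (1e-5 * 1e-6) ** 0.5   # = 1 - 10^-5.5, i.e. ~99.9997 % population loss
```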

Probability of the scenarios

My guesses for the probability of each of the 3 scenarios defined in the previous section, given a population loss caused by a certain type of catastrophe, are in the table below (and in tab “Probability of extinction scenarios by catastrophe”). I calculated the probabilities for the type “other” from the mean of those for the other types of catastrophe (excluding “any”), and the ones for the type “any” from the mean of those for the various types weighted by their probability of leading to a population loss between 95 % and 1 (see the sketch after the table).

Probability of each scenario given a population loss:

| Type of catastrophe (in the 21st century) | No major infrastructure damage or climate change | Major infrastructure damage and climate change | Either major infrastructure damage or climate change |
| --- | --- | --- | --- |
| Any | 29.2 % | 22.5 % | 48.3 % |
| Artificial intelligence | 1/4 | 1/4 | 1/2 |
| Climate change and geoengineering | 0 | 0 | 1 |
| Nanotechnology | 0 | 1/3 | 2/3 |
| Nuclear war | 0 | 1/3 | 2/3 |
| Synthetic biology | 1 | 0 | 0 |
| Other | 25.0 % | 18.3 % | 56.7 % |
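A minimal sketch of the two aggregations, with the weights (probability of a population loss between 95 % and 1) taken from the results table below:

```python
import numpy as np

# Sketch of the aggregation behind the "other" and "any" rows. Scenario
# probabilities per type (columns: no damage, both, either), and weights
# equal to the probability (%) of a population loss between 95 % and 1,
# from the results table below.
scenarios = np.array([
    [1/4, 1/4, 1/2],  # artificial intelligence
    [0.0, 0.0, 1.0],  # climate change and geoengineering
    [0.0, 1/3, 2/3],  # nanotechnology
    [0.0, 1/3, 2/3],  # nuclear war
    [1.0, 0.0, 0.0],  # synthetic biology
])

other = scenarios.mean(axis=0)  # ~[25.0 %, 18.3 %, 56.7 %]

weights = np.array([7.20, 0.0160, 0.422, 0.384, 0.864, 1.69])
any_row = np.average(np.vstack([scenarios, other]), axis=0, weights=weights)
print(other, any_row)  # any_row ~[29.2 %, 22.5 %, 48.3 %]
```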

Results

The tables below contain the results for:

  • The probability of the population loss ranges by type of catastrophe.

  • The probability of extinction by population loss range and scenario.

  • The probability of extinction by population loss range and type of catastrophe.

Probability (%) of a population loss in each range, by type of catastrophe (for “any*”, each column is calculated assuming independence between the types, so the columns do not sum to 100 %):

| Type of catastrophe (in the 21st century) | 0 to 10 % | 10 % to 95 % | 95 % to 1 |
| --- | --- | --- | --- |
| Any* | 100 | 25.3 | 10.3 |
| Any | 68.0 | 27.8 | 4.16 |
| Artificial intelligence | 90.4 | 2.40 | 7.20 |
| Climate change and geoengineering | 98.4 | 1.58 | 0.0160 |
| Nanotechnology | 99.0 | 0.538 | 0.422 |
| Nuclear war | 90.4 | 9.22 | 0.384 |
| Synthetic biology | 90.4 | 8.74 | 0.864 |
| Other | 92.6 | 5.67 | 1.69 |

Probability of extinction (%) for a population loss in each range, by scenario:

| Scenario | 0 to 10 % | 10 % to 95 % | 95 % to 1 |
| --- | --- | --- | --- |
| No major infrastructure damage or climate change | 0 | 0.0413 | 25.1 |
| Major infrastructure damage and climate change | 0.176 | 2.05 | 55.1 |
| Either major infrastructure damage or climate change | 0 | 0.291 | 37.2 |

Probability of extinction (%) for a population loss in each range, by type of catastrophe:

| Type of catastrophe (in the 21st century) | 0 to 10 % | 10 % to 95 % | 95 % to 1 | 0 to 1 (total) |
| --- | --- | --- | --- | --- |
| Any* | 0.180 | 0.141 | 3.95 | 4.26 |
| Any | 0.0269 | 0.171 | 1.57 | 1.77 |
| Artificial intelligence | 0.0397 | 0.0160 | 2.78 | 2.84 |
| Climate change and geoengineering | 0 | 0.00461 | 0.00595 | 0.0106 |
| Nanotechnology | 0.0580 | 0.00471 | 0.182 | 0.245 |
| Nuclear war | 0.0529 | 0.0808 | 0.166 | 0.299 |
| Synthetic biology | 0 | 0.00361 | 0.217 | 0.220 |
| Other | 0.0298 | 0.0312 | 0.634 | 0.695 |

Discussion

Probability of extinction by scenario

The relative importance of major infrastructure damage and climate change decreases as the severity of the population loss increases. The ratio between the probability of extinction without major infrastructure damage or climate change and the probability of extinction with both is (see cells F3:F5 of tab “Probability of extinction by scenario”):

  • 0 for a population loss between 0 and 10 %.

  • 2.02 % for a population loss between 10 % and 95 %.

  • 45.5 % for a population loss between 95 % and 1.

This tendency seems correct, as the probability of extinction is 1 for a population loss of 1 regardless of infrastructure damage and climate change.

Probability of extinction by type of catastrophe

Comparison of absolute values with the GCRS

In the table below (and in tab “Comparison of absolute values with the GCRS”), I compare the probability of extinction by type of catastrophe in the 21st century I estimated with ones I derived from the 2008 Global Catastrophic Risks Survey (GCRS), whose results are presented in this report by Anders Sandberg and Toby Ord from the Future of Humanity Institute[4] (see tab “2008 Global Catastrophic Risks Survey”). The GCRS estimates refer to the period from 2009 to 2099, but I adjusted them to the period from 2023 to 2100 assuming constant risk. Additionally, I derived the GCRS estimate for “other” risks assuming independence between the types of catastrophes[5] (see the sketch below).
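A minimal sketch of these adjustments, assuming period lengths of 91 years (2009 to 2099) and 78 years (2023 to 2100), and taking as inputs my reading of the GCRS median estimates for extinction by 2100, mapped to this post’s categories:

```python
# Sketch of the period adjustment, assuming period lengths of 91 years
# (2009 to 2099) and 78 years (2023 to 2100). The inputs are my reading of
# the GCRS median estimates for extinction by 2100, mapped to this post's
# categories (e.g. "engineered pandemic" to synthetic biology).
def adjust(p, t_old=91, t_new=78):
    """Rescale a risk to a different period, assuming a constant hazard."""
    return 1 - (1 - p) ** (t_new / t_old)

gcrs = {
    "any": 0.19,
    "artificial intelligence": 0.05,
    "nanotechnology": 0.05,
    "synthetic biology": 0.02,
    "nuclear war": 0.01,
}

# "Other" assuming independence, i.e. the risk not covered by the 4 types.
p_other = 1 - gcrs["any"]
for name in ("artificial intelligence", "nanotechnology",
             "synthetic biology", "nuclear war"):
    p_other /= 1 - gcrs[name]
p_other = 1 - p_other

for name, p in {**gcrs, "other": p_other}.items():
    print(f"{name}: {adjust(p):.2%}")  # ~16.5 % for "any", ~6.46 % for "other"
```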

| Type of catastrophe (in the 21st century) | My analysis (%) | GCRS (%) | Absolute difference to GCRS (pp) | Relative difference to GCRS (%) |
| --- | --- | --- | --- | --- |
| Any* | 4.26 | 16.5 | -12.3 | -74.2 |
| Any | 1.77 | 16.5 | -14.8 | -89.3 |
| Artificial intelligence | 2.84 | 4.30 | -1.46 | -34.0 |
| Nanotechnology | 0.245 | 4.30 | -4.06 | -94.3 |
| Nuclear war | 0.299 | 0.858 | -0.558 | -65.1 |
| Synthetic biology | 0.220 | 1.72 | -1.50 | -87.2 |
| Other | 0.695 | 6.46 | -5.76 | -89.2 |

My probabilities of extinction are lower than those I derived from the GCRS for all types of catastrophe. Nanotechnology has the largest relative difference, and artificial intelligence the smallest.

The GCRS did not address “climate change and geoengineering”, but my estimate of 0.0106 % is similar to:

  • 10 % of the best guess of 0.1 % mentioned by Toby Ord in The Precipice for the existential risk due to climate change from 2021 to 2120 (see Table 6.1).

  • 10 % of the upper bound of 0.1 %, and 10 times the best guess of 0.001 % mentioned here by John Halstead for the existential risk due to climate change[6].

  • The upper bound of 0.01 % guessed by 80,000 Hours here for the existential risk due to climate change[7].

Comparison of priorities with The Precipice

Ultimately, what is most relevant for prioritisation is how the various probabilities compare with each other. With this in mind, in the table below (and in tab “Comparison of priorities with The Precipice”), I present the probability of extinction in the 21st century as a fraction of that for “any*”, alongside the existential risk between 2021 and 2120 guessed by Toby Ord in The Precipice (see tab “Existential risk estimates from The Precipice”) as a fraction of his total. The existential risk for “other” was estimated from those for “unforeseen anthropogenic risk” and “other anthropogenic risk” assuming independence between them (see the sketch after the table).

| Type of catastrophe | Normalised probability of extinction for a catastrophe in the 21st century (%) | Normalised existential risk from 2021 to 2120 (%) | Ratio | Decimal logarithm of the ratio |
| --- | --- | --- | --- | --- |
| Artificial intelligence | 66.6 | 60.0 | 1.11 | 0.0455 |
| Climate change and geoengineering | 0.248 | 0.600 | 0.413 | -0.384 |
| Nuclear war | 7.03 | 0.600 | 11.7 | 1.07 |
| Synthetic biology | 5.17 | 20.0 | 0.259 | -0.587 |
| Other | 16.3 | 31.6 | 0.516 | -0.287 |
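As a minimal sketch of the normalisation and ratio, applied to nuclear war (using my estimates of 0.299 % for nuclear war and 4.26 % for “any*”, and Toby Ord’s guesses of 0.1 % for nuclear war and 1/6 in total):

```python
import math

# Sketch of the normalisation and ratio for nuclear war (fractions).
p_nuclear, p_any_star = 0.00299, 0.0426  # my probabilities of extinction
ord_nuclear, ord_total = 1 / 1000, 1 / 6  # Ord's guesses in The Precipice

normalised_mine = p_nuclear / p_any_star  # ~7.03 %
normalised_ord = ord_nuclear / ord_total  # ~0.600 %
ratio = normalised_mine / normalised_ord
print(f"{ratio:.1f}, log10 = {math.log10(ratio):.2f}")  # ~11.7, ~1.07
```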

Relative to Toby Ord’s best guesses, my analysis suggests the relative importance of:

  • Artificial intelligence is similar (rounded to half an order of magnitude).

  • Climate change and geoengineering, synthetic biology, and “other” is half an order of magnitude lower.

  • Nuclear war is one order of magnitude higher.

The adequacy of this comparison depends on the extent to which probability of extinction is a good proxy for existential risk.

Quality of the inputs

In essence, the results I obtained are a function of guesses from Metaculus’ forecasters, Luisa Rodriguez, and me. I should note there is room to improve the quality of the inputs:

  • Regarding Metaculus:

    • Eli Lifland, Gregory Lewis, Misha Yagudin, and Nuño Sempere from Samotsvety Forecasting (and presumably other superforecasters) expressed concerns about relying on Metaculus’ community predictions[8].

    • I also noticed these are internally inconsistent:

      • The probability of a population loss greater than 95 % due to “any” catastrophe is lower than that due to artificial intelligence (4.16 % < 7.20 %; see cells B5:C5 of tab “CCDF of the population loss”).

      • This leads to the probability of extinction due to “any” being lower than that due to artificial intelligence (1.77 % < 2.84 %; see cells D6:E6 of tab “Probability of extinction by catastrophe”).

    • However, I do not know of other forecasts looking into population losses by catastrophe, such as Metaculus’ Ragnarok Series.

  • Luisa’s analysis is great, but “a first step toward understanding this threat from civilizational collapse — not a final or decisive one”.

  • I am not a forecaster, and merely based my guesses on my previous knowledge.

That being said, for the reasons outlined by Scott Alexander here, I believe establishing priorities based on a quantitative model with guessed inputs is often better than guessing priorities.

  1. ^

    To clarify, the probability refers to catastrophes occurring during the 21st century, but the extinction may happen afterwards.

  2. ^

    The results in the Sheet are updated automatically as Metaculus’ predictions change.

  3. ^

    This implies the probability density function (PDF) of the population loss is uniform for each of the 3 ranges.

  4. ^

    More existential risk estimates are available in this database, which was introduced by Michael Aird here.

  5. ^

    This implies the GCRS’ estimates for “any*” are the same as for “any”.

  6. ^

    “With those caveats in my mind, my best guess estimate is that the indirect risk of existential catastrophe due to climate change is on the order of 1 in 100,000, and I struggle to get the risk above 1 in 1,000. Working directly on US-China, US-Russia, India-China, or India-Pakistan relations seems like a better way to reduce the risk of Great Power War than working on climate change”. I assume John’s best guess for the total risk of existential catastrophe due to climate change is similar to his best guess for the indirect risk, which equals his upper bound for the direct risk: “I [John] construct several models of the direct extinction risk from climate change but struggle to get the risk above 1 in 100,000 over all time”.

  7. ^

    “That said, we [80,000 Hours] still think this risk is relatively low. If climate change poses something like a 1 in 1,000,000 risk of extinction by itself, our guess is that its contribution to other existential risks is at most a few orders of magnitude higher — so something like 1 in 10,000”.

  8. ^

    Metaculus’ predictions are at the bottom of Eli’s personal tier list for how much weight to give to AI existential risk forecasts (see this footnote for details).
