Can a terrorist attack cause human extinction? Not on priors

Summary

  • Historical terrorist attack deaths suggest the probability of a terrorist attack causing human extinction is astronomically low: 4.35*10^-15 per year, according to my preferred estimate.

  • One may well update to a much higher extinction risk after accounting for inside view factors. However, extraordinary evidence would be required to move up sufficiently many orders of magnitude for an AI or bio terrorist attack to have a decent chance of causing human extinction.

  • I think it would be great if Open Philanthropy published the results of their efforts to quantify biorisk if/once they are available.

  • In the realm of the more anthropogenic AI, bio and nuclear risk, I personally think underweighting the outside view is a major reason for overly high risk estimates. I encourage readers to check David Thorstad’s series Exaggerating the Risks, which includes subseries on climate, AI and bio risk.

Introduction

I listened to Kevin Esvelt’s stories on The 80,000 Hours Podcast about how bioterrorism may kill billions. They sounded plausible, but at the same time I thought I should account for the outside view, which I knew did not favour terrorism being a global catastrophic risk. So I decided to look into historical terrorist attack deaths to estimate a prior probability for a terrorist attack causing human extinction.

Methods

Firstly, I retrieved data on the deaths in the 202 k terrorist attacks between January 1970 and June 2021 from the Global Terrorism Database (GTD). Then I computed some historical stats for such deaths.

Secondly, I relied on the Python library fitter to find the distributions which best fit the top 1 % of:

  • Terrorist attack deaths.

  • Logarithm of the terrorist attack deaths as a fraction of the global population[1].

I fitted the distributions to the 1 % most deadly attacks because I am interested in the right tail, which may decay faster than suggested by the less extreme points. fitter tries all the types of distributions in SciPy (111 as of 13 November 2023). For each type of distribution, the best fit is the one with the lowest residual sum of squares (RSS), i.e. the sum of the squared differences between the actual and predicted probability density function (PDF). I set the number of bins used to define the PDF to the square root of the number of data points, and left the maximum time to find the best fit parameters at fitter’s default of 30 s.
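
As an illustration, here is a minimal sketch of this fitting step (the file name and loading step are hypothetical placeholders; the actual data handling is in the Colab):

```python
import numpy as np
from fitter import Fitter

# Hypothetical input: deaths per attack from the GTD export, one value per line.
deaths = np.loadtxt("gtd_deaths.txt")

# Keep only the 1 % most deadly attacks, since the interest is in the right tail.
top = np.sort(deaths)[-round(0.01 * len(deaths)):]

# fitter tries all SciPy distribution types, bins the data to estimate the PDF,
# and ranks the fits by residual sum of squares by default.
f = Fitter(top, bins=round(len(top) ** 0.5), timeout=30)
f.fit()
print(f.summary(Nbest=10))  # the 10 distributions with the lowest RSS
```

For the second fit, one would instead pass the logarithm of the nonzero deaths as a fraction of the global population.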

I estimated the probability of the terrorist attack deaths as a fraction of the global population being at least 10^-10, 10^-9, …, and 100 % by multiplying:

  • 1 %, which is the probability of the attack being in the right tail.

  • Probability of the deaths as a fraction of the global population being at least 10^-10, 10^-9, …, and 100 % if the attack is in the right tail, which I got using the best fit parameters outputted by fitter.

I obtained the annual probability of terrorist attack extinction from 1 - (1 - “probability of the terrorist attack deaths as a fraction of the global population being at least 100 %”)^“terrorist attacks per year”.
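
As a concrete sketch of these two steps (the generalised Pareto and its parameters below are purely illustrative placeholders, not the actual best fit):

```python
from scipy import stats

ATTACKS_PER_YEAR = 3920  # roughly 202 k attacks over 51.5 years

# Illustrative best fit distribution over deaths as a fraction of the global
# population; the real parameters come from fitter.
tail_dist = stats.genpareto(c=0.3, loc=0, scale=2e-9)

def probability_at_least(fraction: float) -> float:
    # P(attack in the right tail) * P(fraction killed >= threshold | right tail).
    return 0.01 * tail_dist.sf(fraction)

p_extinction_per_attack = probability_at_least(1)  # at least 100 % of the population
annual_p_extinction = 1 - (1 - p_extinction_per_attack) ** ATTACKS_PER_YEAR
```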

I aggregated probabilities from different best fit distributions using the median. I did not use:

  • The mean, because it ignores information from extremely low predictions and overweights outliers.

  • The geometric mean of odds or the geometric mean, because many probabilities were 0.
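
A toy illustration of why the median is the workable aggregate here (the numbers are made up):

```python
import numpy as np
from scipy.stats import gmean

# Hypothetical annual extinction probabilities from 10 best fit distributions.
estimates = np.array([9e-15, 5e-15, 4e-15, 1e-16, 2e-18, 0, 0, 0, 0, 0])

print(np.median(estimates))  # 1e-18; uses the ranks, so it is robust to outliers
print(np.mean(estimates))    # ~1.8e-15; dominated by the few largest predictions
print(gmean(estimates))      # 0; a single zero collapses the geometric mean
```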

The calculations are in this Sheet and this Colab.

Results

The results are in the Sheet.

Historical terrorist attacks stats

Terrorist attacks basic stats

| Statistic | Terrorist attack deaths | Terrorist attack deaths as a fraction of the global population | Terrorist attack deaths in a single calendar year |
| --- | --- | --- | --- |
| Mean | 2.43 | 3.74*10^-10 | 9.63 k |
| Minimum | 0 | 0 | 173 |
| 5th percentile | 0 | 0 | 414 |
| Median | 0 | 0 | 7.09 k |
| 95th percentile | 10 | 1.53*10^-9 | 31.1 k |
| 99th percentile | 30 | 4.74*10^-9 | 41.8 k |
| Maximum | 1.70 k | 2.32*10^-7 | 44.6 k |

Terrorist attacks by severity

| Minimum terrorist attack deaths | Maximum terrorist attack deaths | Terrorist attacks | Terrorist attacks per year |
| --- | --- | --- | --- |
| 0 | Infinity | 202 k | 3.92 k |
| 0 | 1 | 101 k | 1.97 k |
| 1 | 10 | 89.5 k | 1.74 k |
| 10 | 100 | 10.8 k | 210 |
| 100 | 1 k | 235 | 4.56 |
| 1 k | 10 k | 4 | 0.0777 |
| 10 k | Infinity | 0 | 0 |

[Graph: tail distribution of the terrorist attack deaths.]

Terrorist attacks tail risk

Below are the median RSS, coefficient of determination[2] (R^2), probability of terrorist attack deaths as a fraction of the global population being at least 10^-10, 10^-9, …, and 100 %, and annual probability of a terrorist attack causing human extinction. The medians are taken across the best, top 10, and top 100 distributions according to 3 fitness criteria: lowest RSS[3], Akaike information criterion (AIC), and Bayesian information criterion (BIC). My preferred estimate is the median taken across the 10 distributions with lowest RSS, which is the default fitness criterion in fitter, fitted to the top 1 % logarithm of the terrorist attack deaths as a fraction of the global population. The results for AIC and BIC are identical[4]. Null values may be exactly 0 if they concern bounded distributions, or just sufficiently small to be rounded to 0 due to finite precision. I also show the annual probability of terrorist attack extinction as a function of R^2.
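
A sketch of how these medians can be computed from a fitted Fitter object f (continuing the earlier sketch; df_errors and fitted_param are fitter attributes holding the per-distribution fit errors and best fit parameters, and I assume here that f was fitted to the deaths as a fraction of the global population):

```python
import numpy as np
from scipy import stats

ATTACKS_PER_YEAR = 3920

def annual_extinction_probability(f, name: str) -> float:
    # Freeze the SciPy distribution with the parameters fitter found for it.
    dist = getattr(stats, name)(*f.fitted_param[name])
    p_per_attack = 0.01 * dist.sf(1)  # deaths >= 100 % of the population
    return 1 - (1 - p_per_attack) ** ATTACKS_PER_YEAR

def median_over_best(f, criterion: str, n: int) -> float:
    # criterion is one of "sumsquare_error", "aic" or "bic".
    best = f.df_errors.nsmallest(n, criterion).index
    return float(np.median([annual_extinction_probability(f, d) for d in best]))

# e.g. median_over_best(f, "sumsquare_error", 10) for my preferred estimate.
```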

Best fit to the top 1 % terrorist attack deaths

| Distributions | Median RSS | Median R^2 |
| --- | --- | --- |
| Best (RSS) | 1.87*10^-6 | 99.6 % |
| Top 10 (RSS) | 9.26*10^-6 | 98.0 % |
| Top 100 (RSS) | 1.03*10^-4 | 78.1 % |
| Best (AIC) | 4.71*10^-4 | 0 |
| Top 10 (AIC) | 4.65*10^-4 | 1.21 % |
| Top 100 (AIC) | 8.77*10^-5 | 81.4 % |
| Best (BIC) | 4.71*10^-4 | 0 |
| Top 10 (BIC) | 4.65*10^-4 | 1.21 % |
| Top 100 (BIC) | 8.77*10^-5 | 81.4 % |

Median probability of terrorist attack deaths as a fraction of the global population being at least...

| Distributions | 10^-10 | 10^-9 | 10^-8 | 10^-7 |
| --- | --- | --- | --- | --- |
| Best (RSS) | 1.00 % | 9.94*10^-3 | 9.89*10^-4 | 2.15*10^-6 |
| Top 10 (RSS) | 1.00 % | 1.00 % | 1.43*10^-3 | 2.21*10^-7 |
| Top 100 (RSS) | 1.00 % | 1.00 % | 2.10*10^-3 | 9.00*10^-7 |
| Best (AIC) | 1.00 % | 1.00 % | 9.71*10^-3 | 5.44*10^-3 |
| Top 10 (AIC) | 9.95*10^-3 | 9.94*10^-3 | 9.18*10^-3 | 5.20*10^-3 |
| Top 100 (AIC) | 1.00 % | 1.00 % | 2.10*10^-3 | 1.52*10^-6 |
| Best (BIC) | 1.00 % | 1.00 % | 9.71*10^-3 | 5.44*10^-3 |
| Top 10 (BIC) | 9.95*10^-3 | 9.94*10^-3 | 9.18*10^-3 | 5.20*10^-3 |
| Top 100 (BIC) | 1.00 % | 1.00 % | 2.10*10^-3 | 1.52*10^-6 |

Median probability of terrorist attack deaths as a fraction of the global population being at least...

| Distributions | 10^-6 | 0.001 % | 0.01 % | 0.1 % |
| --- | --- | --- | --- | --- |
| Best (RSS) | 4.64*10^-9 | 1.00*10^-11 | 2.17*10^-14 | 4.66*10^-17 |
| Top 10 (RSS) | 3.60*10^-12 | 3.05*10^-17 | 0 | 0 |
| Top 100 (RSS) | 0 | 0 | 0 | 0 |
| Best (AIC) | 0 | 0 | 0 | 0 |
| Top 10 (AIC) | 0 | 0 | 0 | 0 |
| Top 100 (AIC) | 0 | 0 | 0 | 0 |
| Best (BIC) | 0 | 0 | 0 | 0 |
| Top 10 (BIC) | 0 | 0 | 0 | 0 |
| Top 100 (BIC) | 0 | 0 | 0 | 0 |

Median probability of terrorist attack deaths as a fraction of the global population being at least 1 %, 10 % and 100 %, and median annual probability of terrorist attack extinction.

| Distributions | 1 % | 10 % | 100 % | Median annual probability of terrorist attack extinction |
| --- | --- | --- | --- | --- |
| Best (RSS) | 0 | 0 | 0 | 0 |
| Top 10 (RSS) | 0 | 0 | 0 | 0 |
| Top 100 (RSS) | 0 | 0 | 0 | 0 |
| Best (AIC) | 0 | 0 | 0 | 0 |
| Top 10 (AIC) | 0 | 0 | 0 | 0 |
| Top 100 (AIC) | 0 | 0 | 0 | 0 |
| Best (BIC) | 0 | 0 | 0 | 0 |
| Top 10 (BIC) | 0 | 0 | 0 | 0 |
| Top 100 (BIC) | 0 | 0 | 0 | 0 |

Best fit to the top 1 % logarithm of the terrorist attack deaths as a fraction of the global population

| Distributions | Median RSS | Median R^2 |
| --- | --- | --- |
| Best (RSS) | 0.113 | 98.4 % |
| Top 10 (RSS) | 0.123 | 98.2 % |
| Top 100 (RSS) | 0.654 | 90.5 % |
| Best (AIC) | 6.88 | 0 |
| Top 10 (AIC) | 6.87 | 1.77*10^-3 |
| Top 100 (AIC) | 0.654 | 90.5 % |
| Best (BIC) | 6.88 | 0 |
| Top 10 (BIC) | 6.87 | 1.77*10^-3 |
| Top 100 (BIC) | 0.654 | 90.5 % |

Median probability of terrorist attack deaths as a fraction of the global population being at least...

| Distributions | 10^-10 | 10^-9 | 10^-8 | 10^-7 |
| --- | --- | --- | --- | --- |
| Best (RSS) | 1.00 % | 1.00 % | 2.75*10^-3 | 3.48*10^-5 |
| Top 10 (RSS) | 1.00 % | 1.00 % | 2.74*10^-3 | 3.65*10^-5 |
| Top 100 (RSS) | 1.00 % | 1.00 % | 2.89*10^-3 | 6.85*10^-5 |
| Best (AIC) | 1.00 % | 1.00 % | 8.08*10^-3 | 2.16*10^-3 |
| Top 10 (AIC) | 1.00 % | 1.00 % | 7.18*10^-3 | 2.40*10^-3 |
| Top 100 (AIC) | 1.00 % | 1.00 % | 2.89*10^-3 | 6.81*10^-5 |
| Best (BIC) | 1.00 % | 1.00 % | 8.08*10^-3 | 2.16*10^-3 |
| Top 10 (BIC) | 1.00 % | 1.00 % | 7.18*10^-3 | 2.40*10^-3 |
| Top 100 (BIC) | 1.00 % | 1.00 % | 2.89*10^-3 | 6.81*10^-5 |

Median probability of terrorist attack deaths as a fraction of the global population being at least...

| Distributions | 10^-6 | 0.001 % | 0.01 % | 0.1 % |
| --- | --- | --- | --- | --- |
| Best (RSS) | 4.21*10^-7 | 5.09*10^-9 | 6.15*10^-11 | 7.44*10^-13 |
| Top 10 (RSS) | 4.46*10^-7 | 5.26*10^-9 | 6.05*10^-11 | 6.85*10^-13 |
| Top 100 (RSS) | 4.19*10^-8 | 5.65*10^-11 | 2.34*10^-14 | 4.11*10^-17 |
| Best (AIC) | 0 | 0 | 0 | 0 |
| Top 10 (AIC) | 0 | 0 | 0 | 0 |
| Top 100 (AIC) | 3.95*10^-8 | 5.22*10^-11 | 1.09*10^-14 | 1.55*10^-17 |
| Best (BIC) | 0 | 0 | 0 | 0 |
| Top 10 (BIC) | 0 | 0 | 0 | 0 |
| Top 100 (BIC) | 3.95*10^-8 | 5.22*10^-11 | 1.09*10^-14 | 1.55*10^-17 |

Median probability of terrorist attack deaths as a fraction of the global population being at least 1 %, 10 % and 100 %, and median annual probability of terrorist attack extinction.

| Distributions | 1 % | 10 % | 100 % | Median annual probability of terrorist attack extinction |
| --- | --- | --- | --- | --- |
| Best (RSS) | 9.00*10^-15 | 1.09*10^-16 | 1.11*10^-18 | 4.35*10^-15 |
| Top 10 (RSS) | 7.71*10^-15 | 8.66*10^-17 | 1.11*10^-18 | 4.35*10^-15 |
| Top 100 (RSS) | 5.55*10^-19 | 0 | 0 | 0 |
| Best (AIC) | 0 | 0 | 0 | 0 |
| Top 10 (AIC) | 0 | 0 | 0 | 0 |
| Top 100 (AIC) | 0 | 0 | 0 | 0 |
| Best (BIC) | 0 | 0 | 0 | 0 |
| Top 10 (BIC) | 0 | 0 | 0 | 0 |
| Top 100 (BIC) | 0 | 0 | 0 | 0 |

Discussion

Historical terrorist attack deaths suggest the probability of a terrorist attack causing human extinction is astronomically low. According to my preferred estimate, the median taken across the 10 distributions with lowest RSS fitted to the top 1 % logarithm of the terrorist attack deaths as a fraction of the global population, the annual probability of human extinction caused by a terrorist attack is 4.35*10^-15. Across all my 18 median estimates, it is 4.35*10^-15 for 2, and 0 for the other 16. For context, in the Existential Risk Persuasion Tournament (XPT), superforecasters and domain experts[5] predicted probabilities of an engineered pathogen causing human extinction by 2100 of 0.01 % and 1 % respectively, which are much higher than those implied by my prior estimates.

I do not think anthropics are confounding the results. There would be no one to do this analysis if a terrorist attack had caused human extinction in the past, but there would be for less severe large scale attacks, and there have not been any. The most deadly attack since 1970 in the GTD only killed 1.70 k people. As a side note, the most deadly bioterrorist incident only killed 200 people (over 5 years).

Interestingly, some best fit distributions led to an annual probability of human extinction caused by a terrorist attack higher than 10 % (see the last 2 graphs), which is unreasonably high. Among the best fit distributions to:

  • The top 1 % terrorist attack deaths, a lognormal results in 45.1 %[6] (R^2 = 14.5 %).

  • The logarithm of the top 1 % terrorist attack deaths as a fraction of the global population (the following are ordered by descending R^2):

Nonetheless, I have no reason to put much weight on the distributions above, so I assume it makes sense to rely on the median. In addition, according to extreme value theory, the right tail should follow a generalised Pareto (see the sketch after this list), whose best fit to:

  • The top 1 % terrorist attack deaths results in an annual probability of human extinction caused by a terrorist attack of 0 (R^2 of 94.5 %).

  • The logarithm of the top 1 % terrorist attack deaths as a fraction of the global population results in 0 (98.1 %).
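
For reference, a minimal version of this extreme value theory check (the file name is a hypothetical placeholder for the array of deaths per attack as a fraction of the global population):

```python
import numpy as np
from scipy import stats

# Hypothetical input: deaths per attack as a fraction of the global population.
fractions = np.loadtxt("gtd_death_fractions.txt")
top = np.sort(fractions)[-round(0.01 * len(fractions)):]

# Fit a generalised Pareto to the right tail, as extreme value theory suggests.
c, loc, scale = stats.genpareto.fit(top)

# Probability of a tail attack killing at least 100 % of the population.
p_extinction_given_tail = stats.genpareto.sf(1, c, loc=loc, scale=scale)
```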

Of course, no one is arguing that terrorist attacks with current capabilities pose a meaningful extinction risk. The claim is that the world may become vulnerable if advances in AI and bio make super destructive technology much more widely accessible. One may well update to a much higher extinction risk after accounting for inside view factors, and for indirect effects of terrorist attacks (like precipitating wars). However, extraordinary evidence would be required to move up sufficiently many orders of magnitude for an AI or bio terrorist attack to have a decent chance of causing human extinction. On the other hand, extinction may be caused via other routes, like a big war or accident.

I appreciate it is often difficult to present compelling evidence for high terrorism risk due to infohazards, even though these hinder good epistemics. However, there are ways of quantifying terrorism risk which mitigate them. From Open Philanthropy’s request for proposals to quantify biorisk:

One way of estimating biological risk that we do not recommend is ‘threat assessment’—investigating various ways that one could cause a biological catastrophe. This approach may be valuable in certain situations, but the information hazards involved make it inherently risky. In our view, the harms outweigh the benefits in most cases.

A second, less risky approach is to abstract away most biological details and instead consider general ‘base rates’. The aim is to estimate the likelihood of a biological attack or accident using historical data and base rates of analogous scenarios, and of risk factors such as warfare or terrorism.

As far as I know, there is currently no publicly available detailed model of tail biorisk. I think it would be great if Open Philanthropy published the results of their efforts to quantify biorisk if/once they are available. The request was published about 19 months ago, and the respective grants were announced 17 months ago. Denise Melchin asked 12 months ago whether the results would be published, but there has been no reply.

In general, I agree with David Thorstad that Toby Ord’s guesses for the existential risk between 2021 and 2120 given in The Precipice are very high. In the realm of the more anthropogenic AI, bio and nuclear risk, I personally think underweighting the outside view is a major reason for overly high risk estimates. I encourage readers to check David’s series Exaggerating the Risks, which includes subseries on climate, AI and bio risk.

Acknowledgements

Thanks to Anonymous Person 1 and Anonymous Person 2 for feedback on the draft.

  1. ^

    fitter has trouble finding good distributions if I do not use the logarithm, presumably because the optimisation is harder, or fails, with smaller numbers.

  2. ^

    Fraction of the variance of the actual PDF explained by the predicted PDF. I computed this fraction from 1 minus the ratio between the residual and total sum of squares.
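
    In code, assuming pdf_actual and pdf_predicted are the binned actual and predicted PDFs:

    ```python
    import numpy as np

    def r_squared(pdf_actual: np.ndarray, pdf_predicted: np.ndarray) -> float:
        rss = np.sum((pdf_actual - pdf_predicted) ** 2)  # residual sum of squares
        tss = np.sum((pdf_actual - pdf_actual.mean()) ** 2)  # total sum of squares
        return 1 - rss / tss
    ```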

  3. ^

    Equivalent to highest R^2.

  4. ^

    Although the results I present are identical, the rankings for AIC and BIC are not exactly the same.

  5. ^

    The sample drew heavily from the Effective Altruism (EA) community: about 42% of experts and 9% of superforecasters reported that they had attended an EA meetup.

  6. ^

    In addition, the best fit normal-inverse Gaussian to the top 1 % terrorist attack deaths results in 100 % (R^2 = 86.0 %). I did not mention this in the main text because there must be an error in fitter (or, less likely, in SciPy’s norminvgauss) for this case. The respective cumulative distribution function (CDF) increases up to terrorist attack deaths as a fraction of the global population of 10^-7, but then unexpectedly drops to 0, which cannot be right.