Summary of “The Precipice” (3 of 4): Playing Russian roulette with the future

This post is the third part of my summary of The Precipice, by Toby Ord. Previous posts explored the various sources of existential risks and how to estimate the dangers. This post ties everything together with an overview of the risks we face. The final post will explore our place in the story of humanity and the importance of reducing existential risk.

To communicate his impression of the risks accurately, Ord puts numbers on them. These numbers represent his best guesses about the order of magnitude of each risk, based on the research behind his book. They do not represent highly certain estimates of the risks, and new information could easily change them.[1]

These estimates indicate that risks from anthropogenic sources (such as nuclear war) tend to be much more dangerous than risks from natural sources (such as asteroids). In fact, nuclear war, climate change, and environmental damage are each at least as dangerous as all natural risks combined. And taken together, anthropogenic risks are around 1,000 times greater than all natural sources combined.

Even within anthropogenic risks, some technologies pose far greater risks than others. Engineered pandemics, unaligned artificial intelligence, and uncategorised or unforeseen anthropogenic risks each present around 100 times the risk of any other single source.[2]
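
As a rough check on these comparisons, here is a minimal sketch in Python using Ord’s approximate per-century estimates from the book (all order-of-magnitude guesses, not precise probabilities); it simply computes the ratios quoted above.

```python
# Ord's approximate per-century existential risk estimates (order-of-
# magnitude guesses from The Precipice, not precise probabilities).
risks = {
    "asteroid or comet impact": 1 / 1_000_000,
    "supervolcanic eruption": 1 / 10_000,
    "nuclear war": 1 / 1_000,
    "climate change": 1 / 1_000,
    "other environmental damage": 1 / 1_000,
    "engineered pandemics": 1 / 30,
    "unaligned artificial intelligence": 1 / 10,
}

natural_total = 1 / 10_000    # Ord's combined estimate for all natural risks
anthropogenic_total = 1 / 6   # Ord's combined estimate for anthropogenic risks

# Nuclear war, climate change, and environmental damage each match or exceed
# the combined natural risk.
for name in ("nuclear war", "climate change", "other environmental damage"):
    print(f"{name}: {risks[name] / natural_total:.0f}x all natural risks combined")

# Taken together, anthropogenic risks dominate natural ones by roughly three
# orders of magnitude (~1,667x on these figures).
print(f"all anthropogenic risks: ~{anthropogenic_total / natural_total:.0f}x all natural risks")

# Within the anthropogenic risks, pandemics and AI dwarf the other sources.
print(f"unaligned AI vs nuclear war: {risks['unaligned artificial intelligence'] / risks['nuclear war']:.0f}x")
```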

Ord’s estimate of the total existential risk is one in six this century. This may sound pessimistic, but it implies that we have a five in six chance of making it through this century.

It is as though we are playing Russian roulette with enormous stakes: a one-in-six risk is a single bullet in the revolver’s six chambers.

Ord is optimistic, and his estimate assumes that we will recognise the importance of existential risks and take significant steps to reduce them. If we shut our eyes and maintain business as usual, Ord believes we face risks about twice as high, like playing Russian roulette with two bullets in the cylinder. But if we get our act together, we could remove both bullets and safeguard humanity.
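
As a toy illustration of the analogy’s arithmetic (mine, not the book’s): each bullet loaded into a six-chamber cylinder adds one sixth to the chance of firing.

```python
from fractions import Fraction

CHAMBERS = 6  # a standard revolver cylinder

def chance_of_firing(bullets: int) -> Fraction:
    """Probability that a single spin of the cylinder lands on a loaded chamber."""
    return Fraction(bullets, CHAMBERS)

print(chance_of_firing(1))  # 1/6 -- Ord's estimate if we take the risks seriously
print(chance_of_firing(2))  # 1/3 -- roughly the risk under business as usual
print(chance_of_firing(0))  # 0   -- both bullets removed: humanity safeguarded
```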

The final post in this series will explore why existential risks are so important to prevent, our place in the story of humanity, and why these risks remain neglected today.

Image of the Earth from: www.tobyord.com/earth

[1] A sceptic might believe that Ord’s estimates are too high. For instance, they might put the risk from unaligned artificial intelligence at 1 in 100 this century, rather than Ord’s 1 in 10. Surprisingly, these two views are close in the sense that only a modest amount of scientific evidence would be enough to move someone from one position to the other. They may also be close in their practical implications: even a risk of 1 in 1,000 this century would warrant serious global attention.
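
To see why, compare the two estimates on the odds scale. Below is a minimal sketch; the 1-in-10 figure is Ord’s estimate for unaligned AI, while the Bayes-factor framing is an illustration of the point rather than an argument from the book.

```python
# How much evidence separates "1 in 10" from "1 in 100"? On the odds scale,
# surprisingly little: a likelihood ratio of about 11 would move one view
# to the other.
def odds(p: float) -> float:
    return p / (1 - p)

ord_estimate = 1 / 10       # Ord's estimate of AI risk this century
sceptic_estimate = 1 / 100  # the sceptic's estimate

bayes_factor = odds(ord_estimate) / odds(sceptic_estimate)
print(f"required likelihood ratio: ~{bayes_factor:.0f}")  # ~11
```

About one order of magnitude of odds separates the two positions, so a single piece of evidence roughly eleven times more likely under one view than the other would be enough to move between them.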

[2] Other anthropogenic risks we face include the possibility of atomically precise manufacturing democratising the production of dangerous weapons, the possibility of contaminating Earth with microbes when we bring back soil samples from other planets, and radical science experiments that create truly unprecedented conditions. These risks are all particularly speculative, but even someone who believes several of them pose no risk at all should expect emerging technologies to keep bringing novel dangers.
