Anthropogenic existential risk

Last edit: 27 May 2021 15:11 UTC by EA Wiki assistant

An anthropogenic existential risk is an existential risk arising from intentional or accidental human activity rather than underlying natural processes.

New technologies have played a huge part in the massive growth of human flourishing over the past centuries. However, they also pose serious risks. Nuclear weapons, for example, may have created the potential for wars that result in human extinction. Other technologies may pose similar risks in the future, such as synthetic biology (see global catastrophic biological risk) and artificial intelligence; further risks may arise from fundamental physics research or from sources not yet identified.

That our species has so far survived puts an upper bound on how high extinction risks can be. But humanity has been exposed to natural risks throughout the entirety of its roughly 200,000-year history, whereas anthropogenic risks have emerged only in the last century. Our long track record of survival therefore constrains natural risks tightly, while placing almost no constraint on anthropogenic risks. This asymmetry is generally taken to support the conclusion that anthropogenic risks are significantly higher than natural risks (Bostrom 2004; Snyder-Beattie, Ord & Bonsall 2019; Aschenbrenner 2020). According to Toby Ord, “we face about a thousand times more anthropogenic risk over the next century than natural risk” (Ord 2020: 87).
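The bounding argument can be sketched numerically. The following is a rough illustration only, assuming a constant per-century risk and illustrative survival times (about 2,000 centuries for Homo sapiens, one century of exposure to anthropogenic risks); the function and numbers are for exposition and are not the estimates of Snyder-Beattie, Ord & Bonsall (2019).

```python
# Illustrative sketch of the survival-based upper-bound argument.
# Assumes a constant per-century extinction risk r; asks how large r
# could be while still giving survival this long a probability of at
# least 1 - confidence, i.e. (1 - r)**n >= 1 - confidence.

def max_constant_risk(periods_survived: float, confidence: float = 0.9) -> float:
    """Largest constant per-period risk consistent (at the given
    confidence level) with surviving `periods_survived` periods."""
    return 1 - (1 - confidence) ** (1 / periods_survived)

# Natural risks: ~200,000 years of Homo sapiens ≈ 2,000 centuries of
# survival, so the bound on per-century risk is very low (~0.1%).
natural_bound = max_constant_risk(2000)

# Anthropogenic risks: roughly one century of track record, so the
# same reasoning yields almost no constraint at all.
anthropogenic_bound = max_constant_risk(1)

print(f"per-century natural risk bound:       {natural_bound:.5f}")
print(f"per-century anthropogenic risk bound: {anthropogenic_bound:.2f}")
```

With these illustrative inputs, surviving 2,000 centuries caps the constant natural risk at roughly a tenth of a percent per century, while one century of survival is compatible with risks as high as 90%, which is the sense in which the two upper bounds differ.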


Bibliography

Aschenbrenner, Leopold (2020) Securing posterity, Works in Progress, October 19.

Beckstead, Nick et al. (2014) Unprecedented technological risks, Global Priorities Institute/​Future of Humanity Institute/​Oxford Martin Programme on the Impacts of Future Technology/​Centre for the Study of Existential Risk.

Bostrom, Nick (2004) The future of human evolution, in Charles Tandy (ed.) Death and Anti-Death: Two Hundred Years after Kant, Fifty Years after Turing, vol. 2, Palo Alto, California: Ria University Press, pp. 339–371.

Ord, Toby (2020) The Precipice: Existential Risk and the Future of Humanity, London: Bloomsbury Publishing.

Snyder-Beattie, Andrew, Toby Ord & Michael B. Bonsall (2019) An upper bound for the background rate of human extinction, Scientific Reports, vol. 9, pp. 1–9.

Related entries

differential progress | vulnerable world hypothesis | weapons of mass destruction
