
Anthropogenic existential risk

Last edit: 12 Jul 2022 0:15 UTC by Pablo

An anthropogenic existential risk is an existential risk arising from intentional or accidental human activity rather than underlying natural processes.

New technologies have played a huge part in the massive growth in human flourishing over the past centuries. However, they also pose some serious risks. Nuclear weapons, for example, may have created the potential for wars that result in human extinction. Other technologies may pose similar risks in the future, such as synthetic biology (see global catastrophic biological risk) and artificial intelligence, as well as risks from fundamental physics research and unknown risks.

That our species has so far survived both natural and anthropogenic risks puts an upper bound on how high these risks can be. But humanity has been exposed to natural risks throughout the entirety of its history, whereas anthropogenic risks have emerged only in the last century. The length of the survival record matters: a long track record of surviving natural risks yields a tight upper bound on their magnitude, while a single century of surviving anthropogenic risks yields only a very loose one. This consideration is generally taken to warrant the conclusion that anthropogenic risks are significantly higher than natural risks.[1][2][3] According to Toby Ord, “we face about a thousand times more anthropogenic risk over the next century than natural risk.”[4]
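The survival-bound reasoning above can be sketched numerically. This is only an illustration: it assumes a constant per-century extinction risk and picks round numbers (roughly 2,000 centuries of exposure to natural risks, one century of exposure to anthropogenic risks); it is not a calculation from the cited papers.

```python
def risk_upper_bound(centuries_survived: int, alpha: float = 0.1) -> float:
    """Largest constant per-century extinction risk r consistent with the
    observed survival record, in the sense that any larger r would make
    surviving that long have probability below alpha:
        (1 - r) ** centuries_survived >= alpha  =>  r <= 1 - alpha ** (1 / N)
    """
    return 1 - alpha ** (1 / centuries_survived)

# ~200,000 years (~2,000 centuries) of surviving natural risks:
natural = risk_upper_bound(2000)
# ~1 century of surviving anthropogenic risks:
anthropogenic = risk_upper_bound(1)

print(f"natural risk bound:       {natural:.4%} per century")   # ~0.12%
print(f"anthropogenic risk bound: {anthropogenic:.0%} per century")  # 90%
```

The contrast in the two bounds is the point of the argument: a long survival record constrains natural risk to a fraction of a percent per century, while one century of survival is compatible with anthropogenic risk being very large.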

Further reading

Beckstead, Nick et al. (2014) Unprecedented technological risks, Global Priorities Project.

Ord, Toby (2020) The Precipice: Existential Risk and the Future of Humanity, London: Bloomsbury Publishing.

Related entries

differential progress | natural existential risk | vulnerable world hypothesis | weapons of mass destruction

  1. ^ Bostrom, Nick (2004) The future of human evolution, in Charles Tandy (ed.) Death and Anti-Death: Two Hundred Years after Kant, Fifty Years after Turing, vol. 2, Palo Alto, California: Ria University Press, pp. 339–371.

  2. ^ Snyder-Beattie, Andrew, Toby Ord & Michael B. Bonsall (2019) An upper bound for the background rate of human extinction, Scientific Reports, vol. 9, pp. 1–9.

  3. ^ Aschenbrenner, Leopold (2020) Securing posterity, Works in Progress, October 19.

  4. ^ Ord, Toby (2020) The Precipice: Existential Risk and the Future of Humanity, London: Bloomsbury Publishing, p. 87.

A Theologian’s Response to Anthropogenic Existential Risk

Fr Peter Wyg · 3 Nov 2022 4:37 UTC · 108 points · 17 comments · 11 min read · EA link

Carl Shulman on the common-sense case for existential risk work and its practical implications

80000_Hours · 8 Oct 2021 13:43 UTC · 41 points · 2 comments · 149 min read · EA link

A toy model for technological existential risk

RobertHarling · 28 Nov 2020 11:55 UTC · 10 points · 2 comments · 4 min read · EA link

[Question] What are novel major insights from longtermist macrostrategy or global priorities research found since 2015?

Max_Daniel · 13 Aug 2020 9:15 UTC · 88 points · 56 comments · 1 min read · EA link

Super-exponential growth implies that accelerating growth is unimportant in the long run

kbog · 11 Aug 2020 7:20 UTC · 36 points · 9 comments · 4 min read · EA link

Some thoughts on Toby Ord’s existential risk estimates

MichaelA🔸 · 7 Apr 2020 2:19 UTC · 67 points · 33 comments · 9 min read · EA link

AI Alternative Futures: Exploratory Scenario Mapping for Artificial Intelligence Risk—Request for Participation [Linkpost]

Kiliank · 9 May 2022 19:53 UTC · 17 points · 2 comments · 8 min read · EA link

Climate-contingent Finance, and A Generalized Mechanism for X-Risk Reduction Financing

johnjnay · 26 Sep 2022 13:23 UTC · 6 points · 1 comment · 25 min read · EA link

Summary of “The Precipice” (2 of 4): We are a danger to ourselves

rileyharris · 13 Aug 2023 23:53 UTC · 5 points · 0 comments · 8 min read · EA link (www.millionyearview.com)