
Existential risk factor

Last edit: 16 Sep 2022 11:28 UTC by Pablo

An existential risk factor is a factor that increases the probability of an existential catastrophe. Conversely, an existential security factor is a factor that decreases the probability of such a catastrophe.[1] Analogous concepts have been used to analyze risks of human extinction[2] and s-risks.[3]
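
One way to make the definition more precise (an illustrative formalization, not drawn from the sources cited here): let $X$ denote the event of an existential catastrophe and $F$ the presence of a given factor. Then $F$ is an existential risk factor if

$$P(X \mid F) > P(X \mid \neg F),$$

and an existential security factor if the inequality is reversed.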

Further reading

Baumann, Tobias (2019) Risk factors for s-risks, Center for Reducing Suffering, February 13.

Cotton-Barratt, Owen, Max Daniel & Anders Sandberg (2020) Defence in depth against human extinction: prevention, response, resilience, and why they all matter, Global Policy, vol. 11, pp. 271–282.

Ord, Toby (2020) The Precipice: Existential Risk and the Future of Humanity, London: Bloomsbury Publishing, pp. 175–180.

Related entries

broad vs. narrow interventions | civilizational collapse | compound existential risk | emergency response | existential catastrophe | existential risk | indirect long-term effects

1. Ord, Toby (2020) The Precipice: Existential Risk and the Future of Humanity, London: Bloomsbury Publishing, p. 179.

2. Cotton-Barratt, Owen, Max Daniel & Anders Sandberg (2020) Defence in depth against human extinction: prevention, response, resilience, and why they all matter, Global Policy, vol. 11, pp. 271–282.

3. Baumann, Tobias (2019) Risk factors for s-risks, Center for Reducing Suffering, February 13.

Posts tagged Existential risk factor

Modelling Great Power conflict as an existential risk factor

Stephen Clare · 3 Feb 2022 11:41 UTC
122 points
22 comments · 19 min read · EA link

Risk factors for s-risks

Tobias_Baumann · 13 Feb 2019 17:51 UTC
40 points
3 comments · 1 min read · EA link
(s-risks.org)

AI Governance: Opportunity and Theory of Impact

Allan Dafoe · 17 Sep 2020 6:30 UTC
262 points
19 comments · 12 min read · EA link

On the assessment of volcanic eruptions as global catastrophic or existential risks

Mike Cassidy · 13 Oct 2021 14:32 UTC
112 points
18 comments · 19 min read · EA link

Beyond Simple Existential Risk: Survival in a Complex Interconnected World

Gideon Futerman · 21 Nov 2022 14:35 UTC
84 points
67 comments · 21 min read · EA link

What can we learn from a short preview of a super-eruption and what are some tractable ways of mitigating it

Mike Cassidy · 3 Feb 2022 11:26 UTC
53 points
0 comments · 6 min read · EA link

Crucial questions for longtermists

MichaelA🔸 · 29 Jul 2020 9:39 UTC
104 points
17 comments · 19 min read · EA link

8 possible high-level goals for work on nuclear risk

MichaelA🔸 · 29 Mar 2022 6:30 UTC
46 points
4 comments · 16 min read · EA link

Brian Tse: Risks from Great Power Conflicts

EA Global · 11 Mar 2019 15:02 UTC
23 points
2 comments · 13 min read · EA link
(www.youtube.com)

Robert Wiblin: Making sense of long-term indirect effects

EA Global · 6 Aug 2016 0:40 UTC
14 points
0 comments · 17 min read · EA link
(www.youtube.com)

Does climate change deserve more attention within EA?

Ben · 17 Apr 2019 6:50 UTC
151 points
65 comments · 15 min read · EA link

How can we reduce s-risks?

Tobias_Baumann · 29 Jan 2021 15:46 UTC
42 points
3 comments · 1 min read · EA link
(centerforreducingsuffering.org)

A New X-Risk Factor: Brain-Computer Interfaces

Jack · 10 Aug 2020 10:24 UTC
76 points
12 comments · 42 min read · EA link

Case for emergency response teams

Gavin · 5 Apr 2022 11:08 UTC
247 points
50 comments · 5 min read · EA link

The Precipice: a risky review by a non-EA

Fernando Moreno 🔸 · 8 Aug 2020 14:40 UTC
14 points
1 comment · 18 min read · EA link

Sir Gavin and the green sky

Gavin · 17 Dec 2022 23:28 UTC
50 points
0 comments · 1 min read · EA link

Toby Ord: Fireside chat (2018)

EA Global · 1 Mar 2019 15:48 UTC
20 points
0 comments · 28 min read · EA link
(www.youtube.com)

The most important climate change uncertainty

cwa · 26 Jul 2022 15:15 UTC
144 points
28 comments · 13 min read · EA link

Persuasion Tools: AI takeover without AGI or agency?

kokotajlod · 20 Nov 2020 16:56 UTC
15 points
5 comments · 10 min read · EA link

Well-studied Existential Risks with Predictive Indicators

Noah Scales · 6 Jul 2022 22:13 UTC
4 points
0 comments · 3 min read · EA link

Existential Risk Modelling with Continuous-Time Markov Chains

Radical Empath Ismam · 23 Jan 2023 20:32 UTC
87 points
9 comments · 12 min read · EA link

Classifying sources of AI x-risk

Sam Clarke · 8 Aug 2022 18:18 UTC
41 points
4 comments · 3 min read · EA link

[Crosspost]: Huge volcanic eruptions: time to prepare (Nature)

Mike Cassidy · 19 Aug 2022 12:02 UTC
107 points
1 comment · 1 min read · EA link
(www.nature.com)

Opportunities that surprised us during our Clearer Thinking Regrants program

spencerg · 7 Nov 2022 13:09 UTC
116 points
5 comments · 9 min read · EA link

Miscellaneous & Meta X-Risk Overview: CERI Summer Research Fellowship

Will Aldred · 30 Mar 2022 2:45 UTC
39 points
0 comments · 3 min read · EA link

Nuclear winter scepticism

Vasco Grilo🔸 · 13 Aug 2023 10:55 UTC
110 points
42 comments · 10 min read · EA link
(www.navalgazing.net)

A Gentle Introduction to Risk Frameworks Beyond Forecasting

pending_survival · 11 Apr 2024 9:15 UTC
81 points
4 comments · 27 min read · EA link

Nuclear Risk and Philanthropic Strategy [Founders Pledge]

christian.r · 25 Jul 2023 20:22 UTC
83 points
15 comments · 76 min read · EA link
(www.founderspledge.com)

Some governance research ideas to prevent malevolent control over AGI and why this might matter a hell of a lot

Jim Buhler · 23 May 2023 13:07 UTC
63 points
5 comments · 16 min read · EA link

Assessing the Dangerousness of Malevolent Actors in AGI Governance: A Preliminary Exploration

Callum Hinchcliffe · 14 Oct 2023 21:18 UTC
28 points
4 comments · 9 min read · EA link

An entire category of risks is undervalued by EA [Summary of previous forum post]

Richard R · 5 Sep 2022 15:07 UTC
76 points
5 comments · 5 min read · EA link