
Existential risk factor

Last edit: Sep 16, 2022, 11:28 AM by Pablo

An existential risk factor is a factor that increases the probability of an existential catastrophe. Conversely, an existential security factor is a factor that decreases the probability of such a catastrophe.[1] Analogous concepts have been used to analyze risks of human extinction[2] and s-risks.[3]
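The definition above can be made quantitative: a factor's contribution is the difference between the probability of an existential catastrophe with the factor present and the probability with it absent. A minimal sketch of this framing, using entirely hypothetical probabilities for illustration (none of these numbers come from the sources cited here):

```python
def factor_contribution(p_with_factor: float, p_without_factor: float) -> float:
    """Change in existential-catastrophe probability attributable to a factor.

    A positive result marks a risk factor (it raises the probability);
    a negative result marks a security factor (it lowers it).
    """
    return p_with_factor - p_without_factor

# Hypothetical illustrative numbers only, not estimates from the literature.
great_power_war = factor_contribution(p_with_factor=0.20, p_without_factor=0.15)
strong_institutions = factor_contribution(p_with_factor=0.12, p_without_factor=0.15)

print(great_power_war)       # positive, so a risk factor
print(strong_institutions)   # negative, so a security factor
```

One consequence of this framing, discussed by Ord, is that a risk factor can contribute more to total existential risk than any single direct risk does, which is why such factors merit analysis in their own right.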

Further reading

Baumann, Tobias (2019) Risk factors for s-risks, Center for Reducing Suffering, February 13.

Cotton-Barratt, Owen, Max Daniel & Anders Sandberg (2020) Defence in depth against human extinction: prevention, response, resilience, and why they all matter, Global Policy, vol. 11, pp. 271–282.

Ord, Toby (2020) The Precipice: Existential Risk and the Future of Humanity, London: Bloomsbury Publishing, pp. 175–180.

Related entries

broad vs. narrow interventions | civilizational collapse | compound existential risk | emergency response | existential catastrophe | existential risk | indirect long-term effects

1. Ord, Toby (2020) The Precipice: Existential Risk and the Future of Humanity, London: Bloomsbury Publishing, p. 179.

2. Cotton-Barratt, Owen, Max Daniel & Anders Sandberg (2020) Defence in depth against human extinction: prevention, response, resilience, and why they all matter, Global Policy, vol. 11, pp. 271–282.

3. Baumann, Tobias (2019) Risk factors for s-risks, Center for Reducing Suffering, February 13.

Modelling Great Power conflict as an existential risk factor
Stephen Clare · Feb 3, 2022, 11:41 AM
122 points · 22 comments · 19 min read · EA link

Risk factors for s-risks
Tobias_Baumann · Feb 13, 2019, 5:51 PM
40 points · 3 comments · 1 min read · EA link
(s-risks.org)

AI Governance: Opportunity and Theory of Impact
Allan Dafoe · Sep 17, 2020, 6:30 AM
262 points · 19 comments · 12 min read · EA link

On the assessment of volcanic eruptions as global catastrophic or existential risks
Mike Cassidy · Oct 13, 2021, 2:32 PM
112 points · 18 comments · 19 min read · EA link

Beyond Simple Existential Risk: Survival in a Complex Interconnected World
Gideon Futerman · Nov 21, 2022, 2:35 PM
84 points · 67 comments · 21 min read · EA link

What can we learn from a short preview of a super-eruption and what are some tractable ways of mitigating it
Mike Cassidy · Feb 3, 2022, 11:26 AM
53 points · 0 comments · 6 min read · EA link

Crucial questions for longtermists
MichaelA🔸 · Jul 29, 2020, 9:39 AM
104 points · 17 comments · 19 min read · EA link

8 possible high-level goals for work on nuclear risk
MichaelA🔸 · Mar 29, 2022, 6:30 AM
46 points · 4 comments · 16 min read · EA link

Brian Tse: Risks from Great Power Conflicts
EA Global · Mar 11, 2019, 3:02 PM
23 points · 2 comments · 13 min read · EA link
(www.youtube.com)

Robert Wiblin: Making sense of long-term indirect effects
EA Global · Aug 6, 2016, 12:40 AM
14 points · 0 comments · 17 min read · EA link
(www.youtube.com)

Does climate change deserve more attention within EA?
Ben · Apr 17, 2019, 6:50 AM
152 points · 65 comments · 15 min read · EA link

How can we reduce s-risks?
Tobias_Baumann · Jan 29, 2021, 3:46 PM
42 points · 3 comments · 1 min read · EA link
(centerforreducingsuffering.org)

A New X-Risk Factor: Brain-Computer Interfaces
Jack · Aug 10, 2020, 10:24 AM
76 points · 12 comments · 42 min read · EA link

Case for emergency response teams
technicalities · Apr 5, 2022, 11:08 AM
249 points · 50 comments · 5 min read · EA link

The Precipice: a risky review by a non-EA
Fernando Moreno 🔸 · Aug 8, 2020, 2:40 PM
14 points · 1 comment · 18 min read · EA link

Sir Gavin and the green sky
technicalities · Dec 17, 2022, 11:28 PM
50 points · 0 comments · 1 min read · EA link

Toby Ord: Fireside chat (2018)
EA Global · Mar 1, 2019, 3:48 PM
20 points · 0 comments · 28 min read · EA link
(www.youtube.com)

The most important climate change uncertainty
cwa · Jul 26, 2022, 3:15 PM
144 points · 28 comments · 13 min read · EA link

Persuasion Tools: AI takeover without AGI or agency?
kokotajlod · Nov 20, 2020, 4:56 PM
15 points · 5 comments · 10 min read · EA link

Well-studied Existential Risks with Predictive Indicators
Noah Scales · Jul 6, 2022, 10:13 PM
4 points · 0 comments · 3 min read · EA link

Existential Risk Modelling with Continuous-Time Markov Chains
Radical Empath Ismam · Jan 23, 2023, 8:32 PM
87 points · 9 comments · 12 min read · EA link

Classifying sources of AI x-risk
Sam Clarke · Aug 8, 2022, 6:18 PM
41 points · 4 comments · 3 min read · EA link

[Crosspost]: Huge volcanic eruptions: time to prepare (Nature)
Mike Cassidy · Aug 19, 2022, 12:02 PM
107 points · 1 comment · 1 min read · EA link
(www.nature.com)

Opportunities that surprised us during our Clearer Thinking Regrants program
spencerg · Nov 7, 2022, 1:09 PM
116 points · 5 comments · 9 min read · EA link

Miscellaneous & Meta X-Risk Overview: CERI Summer Research Fellowship
Will Aldred · Mar 30, 2022, 2:45 AM
39 points · 0 comments · 3 min read · EA link

Nuclear winter scepticism
Vasco Grilo🔸 · Aug 13, 2023, 10:55 AM
110 points · 42 comments · 10 min read · EA link
(www.navalgazing.net)

A Gentle Introduction to Risk Frameworks Beyond Forecasting
pending_survival · Apr 11, 2024, 9:15 AM
81 points · 4 comments · 27 min read · EA link

Nuclear Risk and Philanthropic Strategy [Founders Pledge]
christian.r · Jul 25, 2023, 8:22 PM
83 points · 15 comments · 76 min read · EA link
(www.founderspledge.com)

Some governance research ideas to prevent malevolent control over AGI and why this might matter a hell of a lot
Jim Buhler · May 23, 2023, 1:07 PM
63 points · 5 comments · 16 min read · EA link

Assessing the Dangerousness of Malevolent Actors in AGI Governance: A Preliminary Exploration
Callum Hinchcliffe · Oct 14, 2023, 9:18 PM
28 points · 4 comments · 9 min read · EA link

An entire category of risks is undervalued by EA [Summary of previous forum post]
Richard R · Sep 5, 2022, 3:07 PM
76 points · 5 comments · 5 min read · EA link