
Total existential risk


Total existential risk is the cumulative risk of an existential catastrophe.

The concept of total existential risk allows different specific risks to be compared in terms of their contribution to the overall risk of catastrophe. The comparison is possible because the particular existential risks are assumed to differ only in their probability, not in their severity. The assumption is typically warranted since world histories involving existential catastrophes tend to differ in value only in minor ways, relative to how each differs from world histories in which human potential is fully realized. Permanent civilizational collapse, for instance, may be somewhat better or somewhat worse than human extinction; but both are incalculably worse than a world in which humanity has attained its full potential.[1]
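Under the equal-severity assumption, total existential risk reduces to a simple function of the individual risk probabilities. The sketch below illustrates the calculation, additionally assuming (for simplicity) that the specific risks are statistically independent; the risk names and probability figures are hypothetical placeholders, not estimates from Ord or any other source.

```python
# Hypothetical per-risk probabilities of an existential catastrophe over
# some fixed time horizon. These numbers are placeholders for
# illustration, not estimates from the literature.
specific_risks = {
    "unaligned_ai": 0.10,
    "engineered_pandemic": 0.03,
    "nuclear_war": 0.005,
}

# Assuming the risks are statistically independent (a simplifying
# assumption), the total existential risk is the probability that at
# least one catastrophe occurs: 1 - prod(1 - p_i).
no_catastrophe = 1.0
for p in specific_risks.values():
    no_catastrophe *= 1.0 - p
total_risk = 1.0 - no_catastrophe

# Because severities are assumed equal, each risk's contribution to the
# total can be compared directly via its probability.
for name, p in sorted(specific_risks.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name}: p = {p:.3f}")
print(f"total existential risk (independence assumed): {total_risk:.3f}")
```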

The assumption may fail to hold in special cases, however. First, a hellish existential catastrophe not only destroys potential value; it also creates disvalue on an astronomical scale. If the catastrophe is as bad as it could possibly be, it would be significantly worse than a non-hellish existential catastrophe.

Second, as Ord notes, some risks may be more likely to occur in worlds with high potential. A technology that contributes to a risk of this sort would be penalized if assessed by the metric of total existential risk, since the metric registers its contribution to risk but not its contribution to realizing humanity's potential. A straightforward example is artificial intelligence, which increases existential risk from misaligned AI but may also bring humanity closer to realizing its potential.[1]
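The failure of the equal-severity assumption can be made concrete by weighting each risk by the value lost if it occurs. The sketch below, with entirely invented names and numbers, shows how a ranking of risks by probability alone can diverge from a ranking by expected value lost once severities differ, as with hellish catastrophes.

```python
# Hypothetical risks with both a probability and a relative severity
# ("loss" = fraction of attainable value destroyed; values above 1.0
# represent hellish outcomes that also create disvalue). All numbers
# are invented for illustration.
risks = {
    "extinction":      {"p": 0.04, "loss": 1.0},
    "collapse":        {"p": 0.06, "loss": 0.9},
    "hellish_outcome": {"p": 0.02, "loss": 5.0},
}

# Ranking by probability alone, as the equal-severity assumption licenses...
by_probability = sorted(risks, key=lambda r: risks[r]["p"], reverse=True)

# ...versus ranking by expected value lost, which matters once
# severities differ across risks.
by_expected_loss = sorted(
    risks, key=lambda r: risks[r]["p"] * risks[r]["loss"], reverse=True
)

print("ranked by probability:   ", by_probability)
print("ranked by expected loss: ", by_expected_loss)
```

With these placeholder numbers, the hellish outcome ranks last by probability but first by expected value lost, which is exactly why the equal-severity assumption must be checked before comparing risks by probability alone.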

Further reading

Ord, Toby (2020) The Precipice: Existential Risk and the Future of Humanity, London: Bloomsbury Publishing, ch. 6.

Related entries

existential risk | hellish existential catastrophe

  1. Ord, Toby (2020) The Precipice: Existential Risk and the Future of Humanity, London: Bloomsbury Publishing.
