From the wiki: “An existential risk is the risk of an existential catastrophe, i.e. one that threatens the destruction of humanity’s longterm potential.” That can include scenarios such as getting permanently locked into a totalitarian dictatorship, even if they don’t result in extinction.
Thank you! And doubly thank you for the topic link. In case others are confused, I found the end of this post particularly clear: https://forum.effectivealtruism.org/posts/qFdifovCmckujxEsq/existential-risk-is-badly-named-and-leads-to-narrow-focus-on