I think x-risk should remain as meaning extinction risk, but “doom” better encompasses what people refer to when they use x-risk colloquially. Doom encompasses x-risk (extinction risk), any permanent drastic curtailment of potential (permanent stagnation or totalitarian lock-in), and s-risk (suffering risk; fates worse than extinction).
EDIT: I fell prey to the original confusion. x-risk originally stood for existential risk, which encompasses both extinction risk and permanent drastic curtailment of future potential. Perhaps x-risk is better as just extinction risk, and ex-risk should be used for existential risk? Doom is still useful as a broader term for common usage (also including s-risk).
Would be good if OP included a definition of existential risk/x-risk!
I really like this! Doom feels right for existential risks, and x-risk definitely could be used for extinction risk, since it also has the phonetic "x". However, I disagree with the notion that x-risk already means extinction risk; see this paper, for instance, or Sam's comment.
Cool, and thanks for editing the OP. I think the definition of s-risk you give is off though. S-risks are risks of suffering on an astronomical scale (e.g. a future filled with tortured beings). They are sometimes classed as a sub-category of existential risk, but given they are (much) worse than extinction or a drastic curtailment of potential, they are also often talked about separately.
Wait so something can be an S-risk but not an X-risk?
Let me know if you agree with the following:
Extinction risk but not existential risk: Something that kills everything on Earth, but it turns out the future would have been worse if we had survived
S-risk but not existential risk: Factory Farming
S-risk and existential risk: Stable totalitarianism
Existential risk but not S-risk or extinction risk: Permanent civilization collapse.
All three: AI kills us all and then runs horrible simulations
Yes, I agree with that, although perhaps not your first example (I would say that is an existential risk), and for the second one (Factory Farming), to be astronomical in size and therefore an s-risk it would have to be “Space colonisation locking in Factory Farming on a cosmic scale”.
Interesting… So the first scenario is an x-risk that we would want to increase.
edit: I had sort of written about this in a separate post, but I was told that scenario one is extinction risk and not x-risk. Slightly confused, but I agree with you based on the definition of x-risk I used.
This is getting into button-pushing territory. “Would have wanted to increase were we certain of the future being much worse”, maybe. But I don’t think we can ever be certain enough.
Re extinction vs existential, I’d say it is both (extinction being a subset of existential).
Perhaps “safety from AGI doom” (or “safety from ASI doom”, or “ASI doom-safety”) is better than “AGI x-safety”?