Cool, and thanks for editing the OP. I think the definition of s-risk you give is off though. S-risks are risks of suffering on an astronomical scale (e.g. a future filled with tortured beings). They are sometimes classed as a sub-category of existential risk, but given they are (much) worse than extinction or a drastic curtailment of potential, they are also often talked about separately.
Wait so something can be an S-risk but not an X-risk?
Let me know if you agree with the following:
Extinction risk but not existential risk: Something that kills everything on Earth, but it turns out the future would have been worse if we had survived
S-risk but not existential risk: Factory farming
S-risk and existential risk: Stable totalitarianism
Existential risk but not S-risk or extinction risk: Permanent civilization collapse.
All three: AI kills us all and then runs horrible simulations
Yes, I agree with that, although perhaps not your first example (I would say that is an existential risk), and for the second one (Factory Farming), to be astronomical in size and therefore an s-risk it would have to be “Space colonisation locking in Factory Farming on a cosmic scale”.
Interesting… So the first scenario is an x-risk that we would want to increase.
edit: I had sort of written about this in a separate post, but I was told that scenario one is an extinction risk and not an x-risk. Slightly confused, but I agree with you based on the definition of x-risk I used.
This is getting into button-pushing territory. “Would have wanted to increase were we certain of the future being much worse”, maybe. But I don’t think we can ever be certain enough.
Re extinction vs existential, I’d say it is both (extinction being a subset of existential).