Yes, I agree with that, although perhaps not your first example (I would say that is an existential risk), and for the second one (Factory Farming), to be astronomical in size and therefore an s-risk it would have to be “Space colonisation locking in Factory Farming on a cosmic scale”.
Interesting… So the first scenario is an x-risk that we would want to increase.
edit: I had sort of written about this in a separate post but I was told that scenario one is extinction risk and not x-risk. Slightly confused but I agree with you based on the definition of x-risk I used.
This is getting into button-pushing territory. “Would have wanted to increase were we certain of the future being much worse”, maybe. But I don’t think we can ever be certain enough.
Re extinction vs existential, I’d say it is both (extinction being a subset of existential).