I know Ord has his estimate of 6% extinction in the next 100 years, but I don’t know of attempts to extrapolate this or other estimates.
This doesn’t change the substance of your point, but Ord estimates a one-in-six chance of an existential catastrophe this century.
Concerning extrapolation of this particular estimate, I think it’s fairly clear that it would be incorrect, since the bulk of the risk in Toby’s breakdown comes from AI, which is a step risk rather than a state risk: it is incurred once, at the transition to advanced AI, rather than accumulating at a roughly constant rate per century, so the per-century figure can’t simply be projected forward.