If one takes Toby Ord’s x-risk estimates (from here) and adds some uncertainty to each, one gets this Guesstimate model. Total x-risk then ranges from 0.1 to 0.3, with a point estimate of 0.19, i.e. roughly 1 in 5 (vs. the 1 in 6 given in the book).
I personally would add more probability to unforeseen natural risk and unforeseen anthropogenic risk.
The uncertainty regarding AI risk is driving most of the overall uncertainty.
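The approach can be sketched as a small Monte Carlo simulation: wrap each per-risk point estimate in a lognormal uncertainty distribution, then combine the draws into an overall x-risk distribution. This is only an illustrative reconstruction, not the actual Guesstimate model: the per-risk figures are Ord’s headline estimates from The Precipice, the independence assumption and the uncertainty width (`sigma`) are assumptions made here for the sketch.

```python
# Illustrative Monte Carlo sketch of "point estimates + uncertainty".
# Per-risk figures are Ord's headline estimates from The Precipice;
# the lognormal noise width (sigma) is an assumption for this sketch.
import math
import random

random.seed(0)

point_estimates = {
    "AI": 1 / 10,
    "engineered pandemics": 1 / 30,
    "unforeseen anthropogenic": 1 / 30,
    "other anthropogenic": 1 / 50,
    "nuclear war": 1 / 1000,
    "climate change": 1 / 1000,
    "natural risks (total)": 1 / 10_000,
}

def sample_total(sigma: float = 0.5) -> float:
    """One draw of total x-risk: lognormal noise on each point estimate,
    combined (assuming independence) as 1 - prod(1 - p_i)."""
    survival = 1.0
    for p in point_estimates.values():
        p_draw = min(1.0, p * math.exp(random.gauss(0.0, sigma)))
        survival *= 1.0 - p_draw
    return 1.0 - survival

draws = sorted(sample_total() for _ in range(100_000))
lo, mid, hi = (draws[int(q * len(draws))] for q in (0.05, 0.5, 0.95))
print(f"5th pct: {lo:.3f}  median: {mid:.3f}  95th pct: {hi:.3f}")
```

Because the AI term dwarfs the others, widening or narrowing its distribution moves the overall spread far more than any other input, which is the pattern noted above.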