I don’t think we should defer too much to Ord’s x-risk estimates, but since we’re talking about them here goes:
Ord’s estimate of total natural risk is 1 in 10,000, which is roughly 1,700 times less than the total anthropogenic risk (1 in 6).
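For anyone who wants to check the arithmetic, here’s a quick sketch using the estimates quoted above (the variable names are mine, not Ord’s):

```python
# Ord's headline estimates from The Precipice, as quoted in this thread
natural = 1 / 10_000        # total natural x-risk
anthropogenic = 1 / 6       # total anthropogenic x-risk
ai = 1 / 10                 # misaligned AI
pandemics = 1 / 30          # engineered pandemics

# Natural risk is smaller than anthropogenic risk by this factor
print(round(anthropogenic / natural))   # → 1667

# Engineered pandemics vs. AI: within one order of magnitude
print(ai / pandemics < 10)              # → True
```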
Risk from engineered pandemics (1 in 30) is within an order of magnitude of risk from misaligned AI (1 in 10), so it’s hardly a rounding error (although simeon_c recently argued that Ord “vastly overestimates” biorisk).
No, you’re probably thinking of anthropogenic risk. AI is 1 in 10, whereas the total estimated x-risk is 1 in 6.
Ah yes, that’s right. Still, AI contributes the majority of total x-risk: 1 in 10 out of 1 in 6, i.e. about 60%.