I don’t claim it’s impossible that nature survives an AI apocalypse which kills off humanity, but I do think it’s an extremely thin sliver of the outcome space (<0.1%). What odds would you assign to this?
Ok, I guess around 1%? But this is partially driven by model uncertainty; I don’t actually feel confident your number is too small.
I’m much higher (tens of percentage points) on “chance nature survives conditional on most humans being wiped out”; it’s just that most of these scenarios involve some small number of humans being kept around so it’s not literal extinction. (And I think these scenarios are a good part of things people intuitively imagine and worry about when you talk about human extinction from AI, even though the label isn’t literally applicable.)
Thanks for asking explicitly about the odds, I might not have noticed this distinction otherwise.