Yeah, I understood this. That's why I've focused on one particular case for the AI valuing nature, which I think could be compatible with wiping out humans (I'm not going into the other cases Ryan discusses, which I think would more likely involve keeping humans around). I needed to bring in the point about humans surviving to address the counterargument "oh, but in that case humans would probably survive too" (which I think is probable but not certain). Anyway, maybe I was slightly overstating the point? I agree that in this scenario the most likely outcome is that nature doesn't meaningfully survive. But it sounded like you were arguing it was obvious that nature wouldn't survive, which doesn't sound right to me.
I don’t claim it’s impossible that nature survives an AI apocalypse which kills off humanity, but I do think it’s an extremely thin sliver of the outcome space (<0.1%). What odds would you assign to this?
Ok, I guess around 1%? But that's partially driven by model uncertainty; I don't actually feel confident your number is too small.
I'm much higher (tens of percentage points) on "chance nature survives conditional on most humans being wiped out"; it's just that most of those scenarios involve some small number of humans being kept around, so it's not literal extinction. (And I think those scenarios are a good part of what people intuitively imagine and worry about when you talk about human extinction from AI, even though the label isn't literally applicable.)
Thanks for asking explicitly about the odds, I might not have noticed this distinction otherwise.