On the question of priors, I liked AGI Catastrophe and Takeover: Some Reference Class-Based Priors. It is unclear to me whether extinction risk has increased in the last 100 years. I estimated an annual nuclear extinction risk of 5.93*10^-12, which is way lower than the prior for wild mammals of 10^-6.
See my comment on that post for why I don’t agree. I agree nuclear extinction risk is low (but probably not that low)[1]. ASI is really the only thing that is likely to kill every last human (and I think it is quite likely to do that given it will be way more powerful than anything else[2]).
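For reference, here is a minimal sketch (my own illustration, not from the post, and assuming a constant annual risk that is independent across years — a simplification) of how the quoted 5.93*10^-12/year nuclear estimate compares with the 10^-6/year wild-mammal prior over longer horizons:

```python
# Illustrative only: compares the quoted annual nuclear extinction risk estimate
# (5.93e-12) with the 1e-6 reference-class prior for wild mammals, assuming a
# constant annual risk that is independent across years.

annual_nuclear = 5.93e-12  # quoted annual nuclear extinction risk estimate
annual_prior = 1e-6        # wild-mammal reference-class prior (annual)

def cumulative_risk(annual_risk: float, years: int) -> float:
    """Probability of at least one extinction-level event over `years`,
    given a constant, independent annual risk."""
    return 1 - (1 - annual_risk) ** years

for years in (1, 100, 1000):
    print(
        f"{years:>5} years: nuclear ≈ {cumulative_risk(annual_nuclear, years):.2e}, "
        f"prior ≈ {cumulative_risk(annual_prior, years):.2e}"
    )
# The nuclear estimate stays roughly five orders of magnitude below the
# wild-mammal prior at every horizon shown.
```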
I see in your comment on that post, you say “human extinction would not necessarily be an existential catastrophe” and “So, if advanced AI, as the most powerful entity on Earth, were to cause human extinction, I guess existential risk would be negligible on priors?”. To be clear: what I’m interested in here is human extinction (not any broader conception of “existential catastrophe”), and the bet is about that.
Agreed.
[1] But to be clear, global catastrophic / civilisational collapse risk from nuclear is relatively high (these often get conflated with “extinction”).
[2] Not only do I think ASI will kill every last human, I think it’s quite likely it will wipe out all known carbon-based life.