Last I heard, Nate Soares (at MIRI) had an all-things-considered probability of around 80%, and Evan Hubinger recently gave ~80% too. Nate’s reasoning is here, and he would probably also endorse this list of challenges.
I think you don’t need any crazy beliefs to put the probability above 50%, just:

- higher confidence that the core arguments are correct, such that you think there are concrete problems that probably need to be solved to avoid AI takeover
- a prior that is not overwhelmingly low, despite some previously predicted mechanisms for catastrophe, like overpopulation and nuclear war, having been avoided so far. The world is allowed to kill you.
- the observation that not much progress has been made on the problem so far, and the belief that this will not massively speed up as we get closer to AGI

Believing that there are multiple independent core problems we lack traction on, or that some problems will require serial time or multiple attempts that we won’t have, can push this probability higher still.