Earlier this year, Good Judgment superforecasters (in nonpublic data) gave a median probability of 2% that a state actor would carry out a nuclear weapon attack killing at least one person before January 1, 2021. Conditional on that happening, they gave an 84% probability to 1-9 weapons detonating, 13% to 10-99, 2% to 100-999, and 1% to 100 or more.
Here’s a survey of national security experts that gave a median 5% chance of a nuclear great-power conflict killing at least 80 million people over 20 years, although some of the figures in the tables look questionable (a mean less than half the median).
It’s not clear how much one should trust these groups in this area. Over a longer time scale I would expect the numbers to be higher: the current estimates reflect the information that we are not now in a Cold War (or a hot war!), and various technological and geopolitical factors (e.g. the shift to multipolar military power and the rise of China) may drive the risk up.
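As a rough illustration (a minimal arithmetic sketch, not from the original comment), the superforecaster figures above can be combined by multiplying the 2% chance of any attack by each conditional share to get implied unconditional probabilities per outcome:

```python
# Sketch: implied unconditional probabilities for each outcome, obtained by
# multiplying the 2% probability of any attack by the conditional breakdown.

p_attack = 0.02  # median probability of a state nuclear attack before 2021-01-01

# Conditional distribution over weapons detonating, given at least one attack.
# (The last bucket is reported as "100 or more"; presumably "1,000 or more",
# as questioned later in this thread.)
p_given_attack = {
    "1-9 weapons": 0.84,
    "10-99 weapons": 0.13,
    "100-999 weapons": 0.02,
    "1,000 or more weapons": 0.01,
}

for outcome, p_cond in p_given_attack.items():
    print(f"{outcome}: {p_attack * p_cond:.3%} unconditional")
```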
Do you have private access to the Good Judgment data? I’ve thought before about how it would be good to get superforecasters to answer such questions, but didn’t know of a way to access the results of previous questions.
(Though there is the question of how much superforecasters’ previous track record on short-term questions translates to success on longer-term questions.)
GJ results (as opposed to Good Judgment Open) aren’t public, but Open Phil has an account with them. This is from a batch of nuclear war probability questions I suggested Open Phil commission to help assess nuclear risk interventions.
This is really cool, Carl. Thanks for sharing. Do superforecasters ever make judgments about other x-risks?
Not by default, but I hope to get more useful, EA action-relevant forecasts performed and published in the future.
Hi Carl, has there been any progress on this front in the past year? I’d be very interested to see x-risk-relevant forecasts (I’m currently working on a related project).
Shouldn’t the 1% be “1000 or more”?