Just wanted to say I thought this post was great and really appreciate you writing it! I have a hard-to-feed hunger to know what the real situation with nuclear weapons is like, and this is one of the only things to touch it in the past few years. Any other resources you’d recommend?
I’m surprised and heartened to hear some evidence against the “Petrov singlehandedly saved the world” narrative. Is there somewhere I can learn about the other nuclear ‘close calls’ described in the book? (should I just read the book?)
Thanks! Here are some places you might start. (People who have done deeper dives into nuclear risk might have more informed views on what resources would be useful.)
Baum et al., 2018, A Model For The Probability Of Nuclear War makes use of a more comprehensive list of (possible) close calls than I’ve seen elsewhere.
FLI’s timeline of close calls is a more (less?) fun display, which links out to more detailed sources. Note that many of the sources are advocacy groups, and they have a certain spin.
A project that might interest you: pick a few case studies that seem important and follow the citations to the most direct historical accounts, to better understand how close a call each really was.
I thought this interview with Samantha Neakrase of the Nuclear Threat Initiative was helpful for understanding what things people in the nuclear security community worry about today.
Some broader resources:
The probability of nuclear war is only one piece of the puzzle – even a nuclear war would probably not end the world, thankfully. I found the recent Rethink Priorities nuclear risk series (#1, #2, #3, #4, #5, especially #4) very helpful for putting more of the pieces together.
This Q&A with climate scientist Luke Oman gets across some key considerations very efficiently.
I’m also glad that you interpret the discussion of the Petrov incident as ‘some evidence against’. That’s about the level of confidence I intended to convey.
I recently started to feel that celebrating Petrov was a bad choice: he just happened to be in the right place at the right time, and as you say, there were many false positives in that era. Petrov’s actions were important, but they provide no lessons to those who aspire to reduce x-risk.
A better example might be Carl Sagan, who (if I’m correct) researched nuclear winter and successfully advocated against nuclear weapons by conveying the risk of nuclear winter. This seems to have contributed to Gorbachev’s conviction to mitigate nuclear war risk. This story has many components EA cares about: doing research to figure out the impacts, advocating with good epistemics, knowing that making an impact is complex, having a strong vision about what is valuable, searching for cooperation, and effectively changing the minds of influential actors.
[Stumbling upon this a year late and sharing a low-confidence hot take, based mostly on Wikipedia]
I think Carl Sagan’s research and advocacy on nuclear winter is definitely an interesting example to consider, but I’m not sure it’s one we should aim to emulate (at least not in its entirety). And I currently have the impression that he probably did not have good epistemics when doing this work.
My impression is that:
Scientists seem quite divided on how likely nuclear winter would be, and what its consequences would be, given various possible nuclear exchanges
Some people seem to think the early study Sagan was involved with deliberately erred towards alarmism in order to advance the cause of disarmament
Evidence from the Kuwait oil well fires seems not to have matched the predictions of that study
(I’m hoping to learn more about nuclear winter in the coming months, and would probably have more substantive things to say at that point.)
One reason the Sagan example may be interesting is that it could help us think about how to make—or find ways to avoid having to make—tradeoffs between maintaining good epistemics and influencing things in high-profile, sensitive political areas.
Good points! I broadly agree with your assessment, Michael! I’m not at all sure how to judge whether Sagan’s alarmism was intentionally exaggerated or the result of unintentionally poor methodology. That said, I think we need to admit that he was making the argument in a (supposedly) pretty impoverished research landscape on topics such as this. It’s only to be expected that researchers in a new field make mistakes that seem naive once the field is further developed.
I stand by my original point to celebrate Sagan > Petrov, though. I’d rather celebrate (and learn from) someone who acted pretty effectively, even if imperfectly, in a complex situation, than someone who happened to be in the right place at the right time. I’m still incredibly impressed by Petrov though! It’s just... hard to replicate his impact.