First, I am not an academic in this area, and as such my observations will not be strictly bound to the models in question; I think there are other risks, not explicitly examined here, that are having an effect. (Also, the current situation in Ukraine appears to represent a substantial risk of an unintended wider war, one that could ignite through the fog of war or a misreading of an adversary’s actions, and then there is the problem of echo chambers reinforcing poor data or poor interpretation of good data.)
“No plan of operations extends with certainty beyond the first encounter with the enemy’s main strength.” - Moltke the Elder
The chaotic nature of war is a major problem, and when delivery times of nuclear weapons range from perhaps ten minutes or less for SLBMs launched close to their targets, to roughly 30 minutes for ICBMs, to hours for bombers, making the right call under extreme stress becomes a serious problem.
We also need to look at close calls and the role of luck. The Cuban Missile Crisis did not go nuclear because one thread held: Vasili Arkhipov’s refusal to consent to the launch of a nuclear torpedo from a Soviet submarine. Able Archer 83 was a case where, at a time of heightened tensions, a NATO military exercise was interpreted in the Soviet Union as a possible ruse of war. (We cannot control or anticipate the mindset of an opponent who may be predisposed to assume the worst about an adversary’s intentions.) And there is the possibility of technical faults in detection systems producing false positives, as on September 26th, 1983, when sunlight reflecting off clouds caused a malfunctioning Soviet launch-detection system to report ICBM launches from the United States. Again, one thread held: Stanislav Petrov arguably acted above his pay grade by not simply passing the indications further up the chain of command, because the pattern he saw was inconsistent with what he expected a first strike to look like.
So more than one historical weakness has already been shown that could have tipped a tense situation into nuclear war despite the desire of every sane actor to avoid it. I don’t think this is taken into account by the model, and it may be very difficult to do so. Also, the number of data points we have since the beginning of the nuclear era may be insufficient to formulate a suitable model. And there is the problem of proliferation to a wider set of actors, which would increase the probability of a nuclear exchange, and of changing mindsets around the use of tactical nuclear warheads. (Russia, for example, has a doctrine that permits first tactical use, and Putin has threatened nuclear escalation in the Ukraine conflict.)
Again, I am not an academic, and someone with greater knowledge and expertise can probably poke holes in what is simply a non-expert’s look at patterns, at the behaviour of people as individuals and in groups under extreme stress, and at the problems of technical malfunction. (Lastly, the aggravating effects of climate change will probably also change the calculus of catastrophic war over time, which does not appear to have been factored in either; and what of Graham Allison’s “Thucydides Trap”?)
I will be most interested to follow this discussion further, as it is of much more than academic interest, for obvious reasons.