I think I gave the impression that I’m making a more expansive claim than I actually mean to make, and will edit the post to clarify this. The main reason I wanted to write this post is that a lot of people, including a number in the EA community, start with the conception that a nuclear war is relatively likely to kill everyone, either for nebulous reasons or because of nuclear winter specifically. I know most people who’ve examined it know this is wrong, but I wanted that information to be laid out pretty clearly, so someone could get a summary of this argument. I think that’s just the beginning of assessing existential risk from nuclear war, and I really wouldn’t want people to read my post and walk away thinking “nuclear war is nothing to worry about from a longtermist perspective.”
I agree that “We know that one type of existential risk from nuclear war is very small, but we don’t really have a good idea for how large total existential risk from nuclear war [is]”. I’m planning to follow this post with a discussion of existential risks from compounding risks like nuclear war, climate change, biotech accidents, bioweapons, & others.
It feels like I disagree with you on the likelihood that a collapse induced by nuclear war would lead to permanent loss of humanity’s potential / eventual extinction. I currently think humans would retain the most significant basic survival technologies following a collapse and then reacquire lost technological capacities relatively quickly. (I discussed this investigation here, though not in depth.) I’m planning to write this up as part of my compounding risks post or as a separate one.
Agreed that it’s very hard to know the sign on a huge history-altering event, whether it’s a nuclear war or covid.
The main reason I wanted to write this post is that a lot of people, including a number in the EA community, start with the conception that a nuclear war is relatively likely to kill everyone, either for nebulous reasons or because of nuclear winter specifically.
This agrees with my impression, and I do think it’s valuable to correct this misconception. (Sorry, I think it would have been better and clearer if I had said this in my first comment.) This is why I favor work with somewhat changed messaging/emphasis over no work.
It feels like I disagree with you on the likelihood that a collapse induced by nuclear war would lead to permanent loss of humanity’s potential / eventual extinction.
I’m not sure we disagree. My current best guess is that most plausible kinds of civilizational collapse wouldn’t be an existential risk, including collapse caused by nuclear war. (For basically the reasons you mention.) However, I feel way less confident about this than about the claim that nuclear war wouldn’t immediately kill everyone. In any case, my point was not that I in fact think this is likely, but just that it’s sufficiently non-obvious that it would be costly if people walked away with the impression that it’s definitely not a problem.
I’m planning to follow this post with a discussion of existential risks from compounding risks like nuclear war, climate change, biotech accidents, bioweapons, & others.
This sounds like a very valuable topic, and I’m excited to see more work on it.
FWIW, my guess is that you’re already planning to do this, but I think it could be valuable to carefully consider information hazards before publishing on this [both because of messaging issues similar to the one we discussed here and potentially on the substance, e.g. unclear if it’d be good to describe in detail “here is how this combination of different hazards could kill everyone”]. So I think e.g. asking a bunch of people what they think prior to publication could be good. (I’d be happy to review a post prior to publication, though I’m not sure if I’m particularly qualified.)
Yes, I was planning to get review prior to publishing this. In general, when it comes to risks from biotechnology, I’m trying to follow the principles we developed here: https://www.lesswrong.com/posts/ygFc4caQ6Nws62dSW/bioinfohazards. I’d be excited to see, or help workshop, better guidance for navigating information hazards in this space in the future.
I think I gave the impression that I’m making a more expansive claim than I actually mean to make, and will edit the post to clarify this.
Have you made these edits yet, or is this still on the to-do list? Having just read the post, I strongly agree with Max’s assessment, and still think readers could very easily round this post’s claims off to “Nuclear war is very unlikely to be a big deal for longtermists”. The key changes that I’d see as valuable would be:
changing the title (maybe to something like “Nuclear war is unlikely to directly cause human extinction”)
explicitly saying something in the introductory part about how the possibilities of nuclear war causing indirect extinction or other existential catastrophes/trajectory changes are beyond the scope of this post
(There may of course also be other changes that would accomplish similar results)
I also do think that this post contains quite valuable info. And I’d agree that there are some people, including in the EA community, who seem much too confident that nuclear war would directly cause extinction (though, like Max, I’m not aware of anyone who meets that description and has looked into the topic much).
So once you make roughly those tweaks, I’d think this post quite valuable. (Unfortunately, in its present form, I worry that it might create more confusion than it resolves.)
I’d also be excited to see the sort of future work you describe on compounding risks and recovery from collapse! I think those topics are plausibly important and sorely under-explored.
The part I added was:
“By a full-scale war, I mean a nuclear exchange between major world powers, such as the US, Russia, and China, using the complete arsenals of each country. The total number of warheads today (14,000) is significantly smaller than during the height of the Cold War (70,000). While extinction from nuclear war is unlikely today, it may become more likely if significantly more warheads are deployed or if designs of weapons change significantly.”
I also think indirect extinction from nuclear war is unlikely, but I would like to address this more in a future post. I disagree that additional clarifications are needed: I think people made these points clearly in the comments, and anyone motivated to investigate this area seriously can read those. If you want to try to double-crux on why we disagree here, I’d be up for that, though a call might be preferable to save time.