No worries, I was just curious—I’ve tried to find data on things like projections of lives lost in combat between the US and China and can’t find anything good (the best I found was a RAND study from a few years ago, but it didn’t really give projections of actual deaths), so I was curious whether you had gotten your hands on that data to make your projections. Sorry for the misunderstanding; I had assumed a China/US conflict, but that makes sense—probably anyone with nuclear capabilities who gets into a serious foreign entanglement will create an extremely dangerous situation for the world.
probably anyone with nuclear capabilities who gets into a serious foreign entanglement will create an extremely dangerous situation for the world.
I’d agree with this, but partly due to what nuclear capabilities correlate with, rather than solely due to the nuclear capabilities themselves. Off the top of my head, I see at least 4 mechanisms by which great power war could reduce the expected value of the long-term future:
Risk of nuclear war and thereby of nuclear winter (this seems to be the implied focus of your comment)
Increased chances of unsafe development of emerging technologies (or, similarly, less willingness/ability to cooperate on ensuring that technological development proceeds safely)
As this post notes, “In addition, mistrust between major powers makes it harder for them to coordinate on arms control or ensure the safe use of new technologies.”
Increased chance of robust totalitarianism (analogous to how it seems plausible that, had the Nazis won WWII, that regime would’ve spread fairly globally and lasted a fairly long time)
Residual chance of various bad things if there’s a violent disruption of current trends, which seem to be unusually good (see The long-term significance of reducing global catastrophic risks by Beckstead)
Speaking as very much a non-expert, all 4 of those mechanisms seem important to me, without one of them standing out as far more important than the others. (Though I think I’d very weakly expect the first two to be more important than the last two.) If that’s true, and if someone had previously focused primarily on the risks of nuclear winter, this might suggest that person should increase their level of concern about great power conflict, including about conflicts that are very unlikely to result in nuclear weapons use.
(I assume there’s been EA and non-EA work on this general topic that I haven’t seen—this is just my quick take.)