Nitpick: I think you meant bioterrorism, not terrorism, which is a broader category and so includes more data.
Thanks! Fixed.
I don’t know the nuclear field well, so don’t have much to add. If I’m following your comment, though, it seems like you have your own estimate of the chance of a nuclear war injecting 47+ Tg of soot, and on the basis of that you infer the implied probability supers give to extinction conditional on such a war. Why not instead infer that supers have a higher forecast of nuclear war than your 0.39% by 2100? E.g. a ~1.6% chance of nuclear war with 47+ Tg and a 5% chance of extinction conditional on it. I may be misunderstanding your comment. Though to be clear, I think it’s very possible the supers were not thinking things through in as much detail as you; there were a fair number of questions in the XPT.
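For concreteness, here is a minimal sketch of that inference, using the superforecasters’ 0.074 % XPT extinction figure quoted in the reply below and treating the 5 % conditional as a purely illustrative assumption:

```python
# Back out the implied probability of a 47+ Tg nuclear war from an
# unconditional extinction forecast, given an assumed conditional
# extinction probability. Both inputs are figures quoted in this
# thread; the 5% conditional is illustrative, not a sourced estimate.
p_extinction_by_2100 = 0.074 / 100   # superforecasters' XPT forecast
p_extinction_given_war = 0.05        # assumed P(extinction | 47+ Tg war)

implied_p_war = p_extinction_by_2100 / p_extinction_given_war
print(f"Implied P(47+ Tg war by 2100) ≈ {implied_p_war:.2%}")  # ≈ 1.48%
```

That backs out roughly the ~1.6 % figure above (the small gap looks like rounding), well above the 0.39 % estimate being responded to.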
I am left wondering about how much higher extinction risk will become, accounting not only for increased capabilities, but also for increased safety.
I don’t think I follow this sentence? Is it that one might expect future advances in defensive biotech/other tech to counterbalance offensive tech development, and that without a detailed quant model you expect the defensive side to be under-counted?
Why not instead infer that supers have a higher forecast of nuclear war than your 0.39% by 2100? E.g. a ~1.6% chance of nuclear war with 47+ Tg and a 5% chance of extinction conditional on it.
Fair point! Here is another way of putting it. I estimated a probability of 3.29*10^-6 for a 50 % population loss due to the climatic effects of nuclear war before 2050, so around 0.001 % (= 3.29*10^-6*75/25, scaling the 25 years to 2050 up to the 75 years to 2100) before 2100. Superforecasters’ 0.074 % nuclear extinction risk before 2100 is 74 times my risk for a 50 % population loss due to climatic effects. My estimate may be off to some extent, and I only focussed on the climatic effects, not the indirect deaths caused by infrastructure destruction, but my best guess would have to be many OOMs off for the superforecasters’ prediction to be in the right OOM. This makes me believe superforecasters are overestimating nuclear extinction risk.
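Spelling the arithmetic out, a quick sketch (the 25-to-75-year proportional scaling is the simplifying assumption stated above):

```python
# Scale the pre-2050 estimate (25 years out) to pre-2100 (75 years out),
# assuming risk accrues proportionally with time, then compare with the
# superforecasters' nuclear extinction forecast.
p_loss_by_2050 = 3.29e-6                   # 50 % population loss, climatic effects only
p_loss_by_2100 = p_loss_by_2050 * 75 / 25  # ≈ 9.9e-6, i.e. ~0.001 %

supers_extinction_by_2100 = 0.074 / 100    # 7.4e-4
ratio = supers_extinction_by_2100 / p_loss_by_2100
print(f"Ratio ≈ {ratio:.0f}")              # ≈ 75; the 74 above uses the rounded 0.001 %
```

Note that extinction is a strictly higher bar than a 50 % population loss, so the gap the ratio points to is, if anything, understated.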
Is it that one might expect future advances in defensive biotech/other tech to counterbalance offensive tech development [?]
Yes, in the same way that the risk of global warming is often overestimated due to neglecting adaptation.
without a detailed quant model you expect the defensive side to be under-counted?
I expect the defensive side to be under-counted, but not necessarily due to lack of quantitative models. However, I think using quantitative models makes it less likely that the defensive side is under-counted. I have not thought much about this; I am just expressing my intuitions.