I agree that there should be more focus on resilience (thanks for mentioning ALLFED), and I also agree that we need to consider scenarios where leaders do not respond rationally. You may be aware of Toby Ord's discussion of existential risk factors in The Precipice, where he roughly estimates a great power war might increase the total existential risk by 10% (page 176). You say:
What is the multiplying impact factor of climate change on x-risks, compared to a world without climate change?
If forced to guess, considering the effects of climate change, I believe a multiplying factor of at least an order of magnitude is conservative. However, further calculations and estimates are absolutely required to verify this.
So you're saying the impact of climate change is ~90 times as much as his estimate of the impact of great power war (a 900% increase versus a 10% increase in x-risk). I think part of the issue is that you believe the world with climate change is significantly worse than the world is now. We agree that the world with climate change is worse than business as usual, but to claim it is worse than now means that climate change would overwhelm all the economic growth that would have occurred in the next century or so. I think this is hard to defend for expected climate change. But it could be the case for the versions of climate change that ALLFED focuses on, such as abrupt regional climate change, extreme weather (including floods and droughts on multiple continents at the same time) causing around a 10% abrupt food production shortfall, or extreme global climate change of around 6°C or more. Still, I don't think it is plausible to multiply existential risks such as unaligned AGI or an engineered pandemic by 10 because of these climate catastrophes.
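To make the "~90 times" comparison explicit, here is a minimal sketch of the arithmetic, assuming "a multiplying factor of at least an order of magnitude" means the total existential risk is multiplied by 10. The baseline risk value below is a purely hypothetical placeholder; only the ratio matters.

```python
# Hypothetical baseline total existential risk; any positive value gives the same ratio.
baseline_risk = 0.17

# "Order of magnitude" multiplier: total risk goes from r to 10r, i.e. a 900% increase.
climate_increase = (10 * baseline_risk - baseline_risk) / baseline_risk  # 9.0 -> 900%

# Ord's rough estimate for a great power war: total risk goes from r to 1.1r, i.e. a 10% increase.
war_increase = (1.1 * baseline_risk - baseline_risk) / baseline_risk  # 0.1 -> 10%

print(climate_increase / war_increase)  # ~90, hence "~90 times as much"
```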
This is very fair criticism and I agree.
For some reason, when writing "order of magnitude", I was thinking about existential risks that may have a 0.1% or 1% chance of happening being multiplied into the 1-10% range (e.g. nuclear war). However, I wasn't considering many of the existential risks I was actually talking about (like biosafety, AI safety, etc.); it'd be ridiculous for AI safety risk to be multiplied from 10% to 100%.
I think the estimate of a great power war increasing the total existential risk by 10% is much fairer than my estimate. Because of this, in response to your feedback, I've modified my EA Forum post to state that a total existential risk increase of 10% is a fair estimate given expected climate politics scenarios, citing Toby Ord's estimates of existential risk increase under great power conflict.
Thanks a ton for the thoughtful feedback! It is greatly appreciated.