My argument does say something about how nuclear risk should be prioritised: if both risks existed, nuclear would be the lower priority. Maybe much lower.
The complicating thing is that nuclear risks clearly do exist, whereas biorisk and AI risk are much more speculative in terms of whether they exist at all. In this sense I can believe nuclear should be funded more.
I think your arguments do suggest good reasons why nuclear risk might be prioritized lower. Since we operate on the most effective margin, as you note, it is also possible for there to be significant funding margins in nuclear that are highly effective in expectation.
Do you work on researching nuclear risk?
How do you think this disagreement could be more usefully delineated? It seems like there is some interesting disagreement here.
I’m not Matt, but I do work on nuclear risk. If the population fell to 1,000 to 10,000 people, recovery would take a long time, so there is a significant chance of a supervolcanic eruption or an asteroid/comet impact causing extinction before we recover. People note that agriculture and cities developed independently several times, which suggests re-developing them is high probability. However, that only happened when we had a stable, moderate climate, which might not recur. Furthermore, the Industrial Revolution only happened once, so there is less confidence that it would happen again. In addition, it would be more difficult with depleted fossil fuels, phosphorus, etc. Even if we did recover industry, I think our current values are better than randomly chosen values (e.g. slavery might continue longer or democracy be less prevalent).
This feels too confident. A nuclear war followed by a supervolcano is just really unlikely. Plus if there were only 1,000 people left, there would be so much canned food left over that they could just go to a major city and sit in a supermarket.
If a major city can support a million people for 3 days on its reserves, it can support 1,000 people for 3,000 days, roughly 8 years.
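Spelling out that arithmetic (the 3-day reserve figure is the assumption above, not a measured value):

```python
# Back-of-the-envelope: how long one city's food reserves last for a small population.
city_population = 1_000_000   # people the reserves normally feed
reserve_days = 3              # assumed days of food on hand for that population
survivors = 1_000             # post-catastrophe population

person_days = city_population * reserve_days      # 3,000,000 person-days of food
days_for_survivors = person_days / survivors      # 3,000 days
print(f"{days_for_survivors:.0f} days ≈ {days_for_survivors / 365:.1f} years")  # ~8.2 years
```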
Again, I’m not saying that I think it doesn’t matter, but I think these are good reasons why it’s a lower priority than AI.
A nuclear war followed by a supervolcano is just really unlikely.
A nuclear war happening at the same time as a supervolcano is very unlikely. However, it could take a hundred thousand years to recover population, so if supervolcanic eruptions occur roughly every 30,000 years, it is quite likely there would be one before we recover.
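To make "quite likely" concrete, here is a quick sketch treating eruptions as a Poisson process (the memorylessness is a simplifying assumption; the recovery time and eruption interval are the figures above):

```python
import math

# Probability of at least one supervolcanic eruption during recovery,
# modelling eruptions as a Poisson process (a simplifying assumption).
recovery_years = 100_000     # assumed time to recover population
eruption_interval = 30_000   # assumed average years between supervolcanic eruptions

expected_eruptions = recovery_years / eruption_interval   # ~3.3
p_at_least_one = 1 - math.exp(-expected_eruptions)
print(f"P(at least one eruption) ≈ {p_at_least_one:.0%}")  # ~96%
```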
Plus if there were only 1,000 people left, there would be so much canned food left over that they could just go to a major city and sit in a supermarket.
The scenario I’m talking about is one where the worsening climate and loss of technology mean there would not be enough food being produced, so the stored food would be consumed quickly. Furthermore, edible wild species, including fish, may be eaten to extinction.
I agree that more total money should be spent on AGI safety than on nuclear issues. However, resilience to sunlight reduction is much more neglected than AGI safety. That’s why the Monte Carlo analyses found that the cost-effectiveness of resilience to loss of electricity (e.g. from high-altitude detonations of nuclear weapons causing electromagnetic pulses) and resilience to nuclear winter is competitive with AGI safety.
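For intuition only, here is a minimal sketch of how such a Monte Carlo cost-effectiveness comparison works; every distribution and number below is an illustrative placeholder, not a parameter from the actual analyses:

```python
import random

def sample_cost_effectiveness(risk_reduction_range, cost_range_millions):
    """Sample existential-risk reduction per million dollars spent.
    Uniform ranges are placeholders; real analyses use fitted distributions."""
    risk_reduction = random.uniform(*risk_reduction_range)   # fraction of x-risk averted
    cost = random.uniform(*cost_range_millions)              # programme cost, $M
    return risk_reduction / cost

random.seed(0)
trials = 100_000
# Parameter ranges below are made up purely for illustration.
resilience = [sample_cost_effectiveness((1e-4, 1e-2), (10, 100)) for _ in range(trials)]
agi_safety = [sample_cost_effectiveness((1e-3, 1e-1), (100, 1_000)) for _ in range(trials)]

mean = lambda xs: sum(xs) / len(xs)
print(f"resilience:  {mean(resilience):.2e} x-risk averted per $M")
print(f"AGI safety:  {mean(agi_safety):.2e} x-risk averted per $M")
# Whether the two come out competitive depends entirely on the input distributions.
```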