For completeness, you might want to examine counterfactuals, challenges, or suggestions such as:
# AI combined with nuclear weapons, and nuclear war itself, are more immediate neglected risks; the predicted arrival date of AGI tends to get pushed back five years every five years or so, and a nuclear war might push it back further.
# selection for IT-oriented people within academia and EA may lead to AGI being over-emphasised as a GCR in both communities; it is also a genuinely interesting and absorbing topic, so who wouldn't want to prioritise it?
# just because it's a high priority doesn't mean everyone should be working on it!
# more EAs should go into defence, RAND, and the intelligence agencies, so that at least a few EAs know what is going on inside them and the field isn't dominated by hawks.