Instead of framing priorities this way, I believe it would be valuable for more people to adopt a mindset that assumes transformative AI is likely coming and asks: What should we work on in light of that?
For climate change, I think it means focusing on the catastrophes that could plausibly happen in the next couple of decades, such as coincident extreme weather on multiple continents or the collapse of the subpolar gyre. Adaptation therefore becomes relatively more important than emissions reductions. Since AI probably makes nuclear conflict and engineered pandemics more likely, the overall importance of these fields may stay similar, but you would likely move away from long-payoff actions like field building, and from measures likely to take a long time, such as major arsenal reductions or wide adoption of germicidal UV and enhanced air filtration. Instead, one might focus on reducing the chance of nuclear war, especially war driven by AI-enabled systems or AI-induced tensions, or on increasing resilience to nuclear war. On the pandemic side, the priority is of course reducing the risk that AI enables pandemics, but also near-term actions that could have a significant impact, like early warning for pandemics or rapid scale-up of resilience after an outbreak.