I think most efforts to mitigate catastrophic risks also save lives by predictably mitigating subcatastrophic risks, and from a neartermist perspective, this significantly increases their expected value. I’m most confident that this justifies prioritising biorisk and climate change, but less confident with regard to nuclear war and AI.
Similar investments to combat biorisk will predictably have similar effects on endemic infectious diseases and on future pandemics that fall short of being existential risks.
For example, vaccine platform development, probably a key investment for mitigating biorisk, helped us fight COVID (which shouldn’t be classed as an existential risk), and COVID vaccines are estimated to have saved 20 million lives (https://www.imperial.ac.uk/news/237591/vaccinations-have-prevented-almost-20-million/).