It’s important to distinguish existential risk (x-risk) from global catastrophic risk (GCR). Nuclear war and extreme climate change, for example, are much more likely to leave survivors, so they are mostly GCRs rather than x-risks. Similarly with engineered pandemics: it seems more likely that some fraction of humanity would survive one, owing to the relatively slow speed of spread and the possibility of countermeasures (you are only up against human-level intelligence). An unaligned AGI is different in kind: there you are up against a superintelligence, which could plausibly wipe out the human race far too quickly for any response.