I think the answer depends on the timeframe you are asking about. Below are some example timeframes you might ask the question over, with plausible answers for the biggest x-risk in each.
1-3 years: nuclear war
Reasoning: we are not close enough to building TAI for it to arrive within the next few years, whereas a nuclear war this year seems possible.
4-20 years: TAI
Reasoning: Firstly, you could argue we are somewhat closer to TAI than to x-risk-level engineered viruses (though I am very unsure about that). Secondly, the TAI threat is most worrying in scenarios where it arrives very quickly and we lose control (a fast risk), whereas the pandemic threat is most worrying in scenarios where the ability to produce homebrew viruses gradually spreads (a slow risk).
21-50 years: TAI or manmade pandemics (unclear)
Reasoning: As above, TAI is less worrying if we have lots of time to work on alignment, so over this timeframe manmade pandemics become comparably concerning.
51-100 years: unknown unknown risks
Reasoning: Imagine trying to predict today's biggest x-risks from 50 years ago. The world is changing too fast, and there are many technologies that could be transformative and potentially pose x-risk-level threats. To assume that the risks we think are biggest today will still be biggest in 50+ years is hubris.
I think as a community we could do more to map out the likelihood of different risks over different timeframes, and to consider strategies for addressing unknown unknown risks.