Another example is the difficulty of comparing biorisk and AI risk without engaging with potentially infohazardous concrete threat models. While both are considered core cause areas of longtermism, it is challenging to determine how to prioritise these risks without evaluating the likelihood of a catastrophic event.
Thanks for posting this, Nadia!
I would go further, and say that it is challenging to determine whether biorisk should be one of the core areas of longtermism. FWIW, the superforecasters and domain experts of the Existential Risk Persuasion Tournament (XPT) predicted the extinction risk by 2100 from engineered pathogens to be 13.5 % (= 0.01/0.074) and 1.82 times (= 0.01/0.0055) that of nuclear war, respectively. This seemingly contrasts with nuclear not being a core area of longtermism (unlike AI and bio).
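To make the arithmetic explicit, here is a minimal sketch of those two ratios, using the XPT medians implied by the parentheses above (superforecasters: 0.01 % for engineered pathogens vs 0.074 % for nuclear war; domain experts: 1 % vs 0.55 %), written as probabilities rather than percentages:

```python
# Ratios of extinction risk by 2100 (engineered pathogens / nuclear war),
# using the XPT medians implied by the parentheses above.
# All values are probabilities (fractions), not percentages.
pathogens_super = 0.0001   # 0.01 % (superforecasters)
nuclear_super = 0.00074    # 0.074 % (superforecasters)
pathogens_experts = 0.01   # 1 % (domain experts)
nuclear_experts = 0.0055   # 0.55 % (domain experts)

print(pathogens_super / nuclear_super)      # ~0.135, i.e. 13.5 %
print(pathogens_experts / nuclear_experts)  # ~1.82
```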
I personally think both superforecasters and domain experts are greatly overestimating nuclear extinction risk (I guess it is more like 10^-6 over the next 100 years). However, I find it plausible that the extinction risk from engineered pathogens is also much lower than the 3 % bio existential risk from 2021 to 2120 guessed by Toby Ord in The Precipice. David Thorstad is exploring this in a series (the 1st post is already out).
My impression was that nuclear risk has usually ended up as a somewhat lower priority for EAs because it’s less neglected?
Thanks for asking, Jeff!
According to 80,000 Hours’ profiles on nuclear war and catastrophic pandemics, it looks like scale, neglectedness and solvability play similar roles (I sketch the rough numbers after the quotes below):
The scale of nuclear war might be 10 % that of catastrophic pandemics:
“We think the direct existential risk from nuclear war (i.e. not including secondary effects) is less than 0.01%. The indirect existential risk seems around 10 times higher”. So existential nuclear risk is less than 0.1 %, which might be interpreted as 0.01 %?
“Overall, we think the risk [from “existential biological catastrophe”] is around 0.1%, and very likely to be greater than 0.01%, but we haven’t thought about this in detail”.
Catastrophic pandemics might be 3 times as neglected as nuclear war:
“This issue is not as neglected as most other issues we prioritise. Current spending is between $1 billion and $10 billion per year (quality-adjusted).” So maybe $3 billion (geometric mean)?
“As a result, our quality-adjusted estimate suggests that current spending is around $1 billion per year. (For comparison with other significant risks, we estimate that hundreds of billions per year are spent on climate change, while tens of millions are spent on reducing risks from AI.)”
It sounds like they think reducing the risk from catastrophic pandemics is more tractable:
“Making progress on nuclear security seems somewhat tractable. While many routes to progress face significant political controversy, there may also be some more neglected ways to reduce this risk.”
“There are promising existing approaches to improving biosecurity, including both developing technology that could reduce these risks (e.g. better bio-surveillance), and working on strategy and policy to develop plans to prevent and mitigate biological catastrophes.”
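As promised above, here is a rough sketch of the numbers behind the scale and neglectedness comparisons. This is my reading of the quotes, not figures 80,000 Hours states directly: I interpret “less than 0.1 %” as roughly 0.01 % for nuclear, and take the $1 billion to $10 billion range as the nuclear war profile’s spending estimate and the roughly $1 billion figure as the catastrophic pandemics profile’s:

```python
# Rough numbers behind the scale and neglectedness comparisons above.
# My interpretation of the 80,000 Hours quotes, not figures they state directly.
from math import sqrt

# Scale: existential risk estimates.
nuclear_xrisk = 0.0001  # interpreting "less than 0.1 %" as roughly 0.01 %
bio_xrisk = 0.001       # "around 0.1 %"
print(nuclear_xrisk / bio_xrisk)  # 0.1, i.e. nuclear scale ~10 % that of pandemics

# Neglectedness: quality-adjusted spending per year.
nuclear_spending = sqrt(1e9 * 10e9)  # geometric mean of $1 billion and $10 billion, ~$3.2 billion
bio_spending = 1e9                   # "around $1 billion per year"
print(nuclear_spending / bio_spending)  # ~3.2, i.e. pandemics ~3 times as neglected
```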
So you may be right that the level of risk is not a major driver of nuclear war not being a core area. However, I guess other organisations believe the bio existential risk to be higher than 80,000 Hours does, whereas few will have higher estimates for the nuclear existential risk.