Existential risk ≠ extinction risk ≠ global catastrophic risk
For an expanded version of the following points, see Clarifying existential risks and existential catastrophes and/or 3 suggestions about jargon in EA.
There are some places where you seem to use the terms "existential risk" and "extinction risk" interchangeably. For example, you write:

I do not think it is obvious that reducing the probability of extinction does more good per dollar than the value drift rate, which naively suggests the effective altruist community should invest relatively more into reducing value drift. But I find it plausible that, upon further analysis, it would become clear that existential risk matters much more.
Additionally, it seems that, to get your "annual extinction probability" estimate, some of the estimates you use from the spreadsheet I put together are actually existential risk, global catastrophic risk, or collapse risk. For example, you seem to use Ord's estimate of total existential risk, Rees' estimate of the odds that our present civilization on earth will survive to the end of the present century, and Simpson's estimate that "Humanity's prognosis for the coming century is well approximated by a global catastrophic risk of 0.2% per year" (emphases added).
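To make concrete how much the choice of estimate matters, here is a minimal sketch of the usual way a century-level total risk is converted into an annual probability. The conversion assumes a constant, independent annual hazard rate, which is itself a simplifying assumption; the figures are just the ones cited above, used illustratively.

```python
# Convert a total risk over a period into a constant annual probability,
# assuming an independent, constant annual hazard:
#   total = 1 - (1 - annual) ** years
def annual_from_total(total_risk: float, years: float) -> float:
    return 1 - (1 - total_risk) ** (1 / years)

# Ord's headline figure: roughly 1 in 6 total *existential* risk this century.
ord_annual = annual_from_total(1 / 6, 100)  # ~0.18% per year

# Simpson's figure: 0.2% per year of *global catastrophic* risk.
simpson_annual = 0.002

# Note these are estimates of different things (existential risk vs global
# catastrophic risk), so combining them into one "annual extinction
# probability" mixes categories with different scopes.
print(f"Ord (existential, annualized):  {ord_annual:.4%}")
print(f"Simpson (global catastrophic): {simpson_annual:.4%}")
```

The point of the sketch is just that the annualized numbers land in a similar range while referring to different categories of event, which makes it easy to conflate them when building a single discount rate.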
But, as both Bostrom and Ord make clear in their writings on existential risk, extinction is not the only possible type of existential catastrophe. There could also be an unrecoverable collapse or an unrecoverable dystopia. And many global catastrophes would not be existential catastrophes.
I see this as important because:
Overlooking the fact that there are possible types of existential catastrophe other than extinction might lead us to do too little to protect against them.
Relatedly, using the term "existential risk" when one really means "extinction risk" might make existential risk less effective as jargon that can efficiently convey this key thing many EAs care about.
Existential risk and global catastrophic risk are both very likely at least a bit higher than extinction risk, since they cover extinction plus a larger set of other possible events. And I'd guess collapse risk might be higher as well. So you may end up with an overly high extinction risk estimate in your discount rate.
Alternatively, if existential risk is actually the most appropriate thing to include in your discount rate (rather than extinction risk), using estimates of extinction risk alone may lead to your discount rate being too low. This is because extinction risk estimates overlook the risk of unrecoverable collapse or dystopia.
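The direction of the error matters because long-run expected value is roughly inversely proportional to the catastrophe-adjusted discount rate. A minimal sketch, assuming the only discounting is a constant annual probability r that the benefit stream ends, so the present value of a perpetual unit stream is the geometric sum (1 − r)/r; the two rates below are hypothetical, purely for illustration:

```python
# Present value of a perpetual unit benefit stream when the only "discounting"
# is an annual probability r that the stream ends:
#   PV = sum over t >= 1 of (1 - r)**t = (1 - r) / r
def present_value(r: float) -> float:
    return (1 - r) / r

extinction_only = 0.001  # hypothetical annual extinction risk
existential = 0.002      # hypothetical (broader) annual existential risk

# Doubling the rate roughly halves long-run value, so whether the discount
# rate reflects extinction risk or the broader existential risk matters a lot.
print(present_value(extinction_only))  # ~999
print(present_value(existential))      # ~499.5
```

The design point is simply that the discount rate enters the denominator, so category errors in the risk estimate translate nearly one-for-one into errors in estimated long-run value.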
To be clear, I have no problem with sources that just talk about extinction risk. Often, that's the appropriate scope for a given piece of work. I just have a pet peeve about people who are really talking about extinction risk but use the term existential risk, or vice versa.
Also to be clear, you're far from the only person who's done that, and this isn't really a criticism of the substance of the post (though it may suggest that the estimates should be tweaked somewhat).