Some reasons I can imagine for focusing on 90+% loss scenarios:
You might just have the empirical view that very few things would cause ‘medium-sized’ losses of a lot of the future’s value. It could then be useful to define ‘existential risk’ to exclude medium-sized losses, so that when you talk about ‘x-risks’ people fully appreciate just how bad you think these outcomes would be.
‘Existential’ suggests a threat to the ‘existence’ of humanity, i.e., an outcome about as bad as human extinction. (Certainly a lot of EAs misunderstand x-risk and think it’s equivalent to extinction risk; I did too when I first joined the community!)
After googling a bit, I now think Nick Bostrom’s conception of existential risk (at least as of 2012) is similar to Toby’s. In https://www.existential-risk.org/concept.html, Nick divides up x-risks into the categories “human extinction, permanent stagnation, flawed realization, and subsequent ruination”, and says that in a “flawed realization”, “humanity reaches technological maturity” but “the amount of value realized is but a small fraction of what could have been achieved”. This only makes sense as a partition of x-risks if all x-risks reduce value to “a small fraction of what could have been achieved” (or reduce the future’s value to zero).
I still think that the definition of x-risk I proposed is a bit more useful, and I think it’s a more natural interpretation of phrasings like “drastically curtail [Earth-originating intelligent life’s] potential” and “reduce its quality of life (compared to what would otherwise have been possible) permanently and drastically”. Perhaps I should use a new term, like hyperastronomical catastrophe, when I want to refer to something like ‘catastrophes that would reduce the total value of the future by 5% or more’.
I agree with everything but your final paragraph.
On the final paragraph, I don’t strongly disagree, but:
To me, “drastically curtail” more naturally means “reduces to much less than 50%” (though that may be biased by my having also heard Ord’s operationalisation of the same term).
At first glance, I feel averse to introducing a new term for something like “reduces by 5-90%”
I think “non-existential trajectory change”, or just “trajectory change”, maybe does an OK job of capturing what you want to say
Technically those things would also cover 0.0001% losses or the like. But it seems like you could just say “trajectory change” and then also talk about roughly how much loss you mean?
It seems like if we come up with a new term for the 5-90% bucket, we would also want a new term for other buckets?