You’re right: it looks like most written texts, especially more formal ones, give definitions where x-risks are equal to GCRs or a strict subset of them. We should probably try to roll that out to informal discussions and operationalisations too.
“Definition: Global Catastrophic Risk – risk of events or processes that would lead to the deaths of approximately a tenth of the world’s population, or have a comparable impact.” GCR Report
“A global catastrophic risk is a hypothetical future event that has the potential to damage human well-being on a global scale.” - Wiki
“Global catastrophic risk (GCR) is the risk of events large enough to significantly harm or even destroy human civilization at the global scale.” GCRI
“These represent global catastrophic risks—events that might kill a tenth of the world’s population.”—HuffPo