My understanding: GCR = (something like) risk of major catastrophe that kills 100mn+ people
(I think the GCR book defines it as risk of 10mn+ deaths, but that seemed too low to me).
So, as I was using the term, something being an x-risk does not entail it being a GCR. I’d count ‘Humanity’s moral progress stagnates or we otherwise end up with the wrong values’ as an x-risk but not a GCR.
Interesting (/worrying!) how we’re understanding widely-used terms so differently.
Agree that that’s the most common operationalization of a GCR. It’s a bit inelegant for GCR not to include all x-risks, though, especially given that the two terms are often used interchangeably within EA.
It would be odd if the onset of a permanently miserable dictatorship didn’t count as a global catastrophe because no lives were lost.
Could you or Will provide an example of a source that explicitly uses “GCR” and “xrisk” in such a way that there are non-GCR xrisks? You say this is the most common operationalization, but I’m only finding examples that treat xrisk as a subset of GCR, as the Bostrom quote above does.
You’re right, it looks like most written texts, especially more formal ones, give definitions where x-risks are equal to GCRs or a strict subset of them. We should probably just try to roll that out to informal discussions and operationalisations too.
“Definition: Global Catastrophic Risk – risk of events or processes that would lead to the deaths of approximately a tenth of the world’s population, or have a comparable impact.” — GCR Report
“A global catastrophic risk is a hypothetical future event that has the potential to damage human well-being on a global scale.” — Wiki
“Global catastrophic risk (GCR) is the risk of events large enough to significantly harm or even destroy human civilization at the global scale.” — GCRI
“These represent global catastrophic risks—events that might kill a tenth of the world’s population.” — HuffPo
“counts as an xrisk (and therefore as a GCR)”
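To pin down the two usages being contrasted above, here is a minimal formal sketch (my own notation; the death-toll function $D$, the potential-destruction predicate $P$, and the $10^{8}$ threshold are assumptions standing in for the “100mn+ deaths” operationalization, not taken from any of the sources quoted):

\[
\mathrm{GCR_{op}} = \{\, e : D(e) \ge 10^{8} \,\}, \qquad \mathrm{XRisk} = \{\, e : P(e) \,\}
\]
\[
D(e) = 0 \;\wedge\; P(e) \;\Longrightarrow\; e \in \mathrm{XRisk} \setminus \mathrm{GCR_{op}}
\]

where $D(e)$ is the expected death toll of event $e$ and $P(e)$ holds iff $e$ permanently destroys humanity’s long-term potential. A zero-death value lock-in then lands in $\mathrm{XRisk}$ but outside $\mathrm{GCR_{op}}$; it is only the “or have a comparable impact” clause in the formal definitions that restores $\mathrm{XRisk} \subseteq \mathrm{GCR}$.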