And it’s framed as long-run future because we think that there are potentially lots of things that could have a huge positive impact on the value of the long-run future which aren’t GCRs—like humanity having the right values, for example.
I don’t have much to add to what Rob W and Carl said, but I’ll note that Bostrom defined “existential risk” like this back in 2008:
A subset of global catastrophic risks is existential risks. An existential risk is one that threatens to cause the extinction of Earth-originating intelligent life or to reduce its quality of life (compared to what would otherwise have been possible) permanently and drastically.
Presumably we should replace “intelligent” here with “sentient” or similar. The reason I’m quoting this is that on the above definition, it sounds like any potential future event or process that would cost us a large portion of the future’s value counts as an xrisk (and therefore as a GCR). ‘Humanity’s moral progress stagnates or we otherwise end up with the wrong values’ sounds like a global catastrophic risk to me, on that definition. (From a perspective that does care about long-term issues, at least.)
I’ll note that I think there’s at least some disagreement at FHI / Open Phil / etc. about how best to define terms like “GCR”, and I don’t know if there’s currently a consensus or what that consensus is. Also worth noting that the “risk” part is more clearly relevant than the “global catastrophe” part—malaria and factory farming are arguably global catastrophes in Bostrom’s sense, but they aren’t “risks” in the relevant sense, because they’re already occurring.
My understanding: GCR = (something like) risk of major catastrophe that kills 100mn+ people
(I think the GCR book defines it as risk of 10mn+ deaths, but that seemed too low to me).
So, as I was using the term, something being an x-risk does not entail it being a GCR. I’d count ‘Humanity’s moral progress stagnates or we otherwise end up with the wrong values’ as an x-risk but not a GCR.
Interesting (/worrying!) how we’re understanding widely-used terms so differently.
Agree that that’s the most common operationalization of a GCR. It’s a bit inelegant for GCR not to include all x-risks, though, especially given that the two terms are often used interchangeably within EA.
It would be odd if the onset of a permanently miserable dictatorship didn’t count as a global catastrophe because no lives were lost.
Could you or Will provide an example of a source that explicitly uses “GCR” and “xrisk” in such a way that there are non-GCR xrisks? You say this is the most common operationalization, but I’m only finding examples that treat xrisk as a subset of GCR, as the Bostrom quote above does.
You’re right: it looks like most written texts, especially more formal ones, give definitions where x-risks are equal to or a strict subset of GCRs. We should probably just try to roll that out to informal discussions and operationalisations too.
“Definition: Global Catastrophic Risk – risk of events or processes that would lead to the deaths of approximately a tenth of the world’s population, or have a comparable impact.” - GCR Report
“A global catastrophic risk is a hypothetical future event that has the potential to damage human well-being on a global scale.” - Wiki
“Global catastrophic risk (GCR) is the risk of events large enough to significantly harm or even destroy human civilization at the global scale.” - GCRI
“These represent global catastrophic risks—events that might kill a tenth of the world’s population.” - HuffPo