“Existential risk” has the advantage over “long-term future” and “far future” that it sounds like a technical term, so people are more likely to Google it if they haven’t encountered it (though admittedly this won’t fully address people who think they know what it means without actually knowing). In contrast, someone might just assume they know what “long-term future” and “far future” mean, and if they do Google those terms, they’ll have a harder time finding a relevant or consistent definition. Plus “long-term future” still has the problem that it suggests existential risk can’t be a near-term issue, even though some people working on existential risk are focusing on nearer-term scenarios than, e.g., some people working on factory farming abolition.
I think either “global catastrophic risk” or “technological risk” would work fine for this purpose, though, and would avoid the main concerns raised about both categories. (“Technological risk” also strikes me as a more informative / relevant / joint-carving category than the others considered, since x-risk and the far future can overlap more with environmentalism, animal welfare, etc.)
Of course, I totally forgot about the “global catastrophic risk” term! I really like it, and it doesn’t only suggest extinction risks. Even its acronym sounds pretty cool. I also really like your “technological risk” suggestion, Rob. Referring to GCRs as “long-term future” is a pretty obvious branding tactic by those who prioritize GCRs. It’s vague, misleading, and dishonest.
Just a heads up: “technological risks” ignores all the non-anthropogenic catastrophic risks. “Global catastrophic risks” seems good.