X-risk as a focus for neartermism

I think it’s unfortunate that x-risks are usually lumped in with longtermism, and that longtermism is talked about much more as a top-level EA cause area these days, with x-risk less so. This is despite the fact that, arguably, x-risk is very important from a short-term (or at least medium-term) perspective too.
As OP says in #2, according to our best estimates, many (most?) people’s chances of dying in a global catastrophe over the next 10-25 years are higher than their chances of dying from many (any?) other common causes of death (car accidents, infectious disease, and, for those of median age or younger, heart disease and cancer, etc.). As Carl Shulman outlines in his common-sense case for x-risk work on the 80k podcast:
If you believe that the risk of human extinction over the next century is something like one in six (as Toby Ord suggests is a reasonable figure in his book The Precipice), then it would be worth the US government spending up to $2.2 trillion to reduce that risk by just 1%, in terms of [present] American lives saved alone.
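For intuition, here is a rough back-of-the-envelope reconstruction of that figure. The US population (~330 million) and a value of statistical life of ~$4 million are my assumed inputs, not numbers taken from the podcast, so treat this as a sketch of the shape of the calculation rather than Shulman’s exact method:

```python
# Back-of-the-envelope sketch of the ~$2.2 trillion figure.
# Assumed inputs (my own, not from the podcast): US population and value of a statistical life (VSL).
us_population = 330e6      # people
vsl = 4e6                  # dollars per statistical life (US agencies use figures in roughly this range)

catastrophe_risk = 1 / 6   # Ord's estimate of existential catastrophe risk this century
relative_reduction = 0.01  # reducing that risk by 1%, i.e. from 1/6 to 0.99 * (1/6)

expected_lives_saved = us_population * catastrophe_risk * relative_reduction
worthwhile_spend = expected_lives_saved * vsl

print(f"Expected present American lives saved: {expected_lives_saved:,.0f}")  # ~550,000
print(f"Worth spending up to: ${worthwhile_spend / 1e12:.1f} trillion")       # ~$2.2 trillion
```

Note that this counts only present American lives, as the quote says; counting the rest of the world, let alone future generations, pushes the number far higher.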
See also Martin Trouilloud’s comment arguing that “a great thing you can do for the short term is to make the long term go well”, and this post arguing that “the ‘far future’ is not just the far future” [due to near-term transformative technology and x-risk].
I think x-risk should be reinstated as a top-level EA cause area in its own right, distinct from longtermism. I worry that having it as a sub-level concern under the broad heading of longtermism will lead to people seeing it as less urgent than it is, with “longtermism” giving the impression that we have plenty of time to figure things out (when, in expectation, we really don’t).
If you believe that the risk of human extinction over the next century is something like one in six (as Toby Ord suggests is a reasonable figure in his book The Precipice)
To be precise, Toby Ord’s figure of one in six in The Precipice refers to the chance of existential catastrophe, not human extinction. Existential catastrophe is a broader category that also includes events such as an unrecoverable collapse of civilisation.
Scott Alexander argues for the focus to be put (back) on x-risk here.
Great point!
This is especially so considering that an all-things-considered (and IMO conservative) estimate puts the chance of AGI arriving within (now) 14 years at 10%! That is a huge amount of short-term risk! It should not be treated as (exclusively) part of the longtermist cause area.
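To make the “short-term” framing concrete, here is a quick sketch (my own arithmetic, not from the comment) converting “10% within 14 years” into an implied average annual probability, under the simplifying assumption of a constant hazard rate each year:

```python
# Convert "10% chance of AGI within 14 years" into an implied average annual
# probability, assuming (as a simplification) a constant hazard rate per year.
p_within_horizon = 0.10   # cumulative probability over the horizon
horizon_years = 14

annual_p = 1 - (1 - p_within_horizon) ** (1 / horizon_years)
print(f"Implied annual probability: {annual_p:.2%}")  # ~0.75% per year
```

Even under this simple model, that is risk on a timescale relevant to people alive today, which is the point of treating x-risk as a neartermist concern as well as a longtermist one.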