Yes! I’ve been thinking along similar lines recently, although I have framed things a bit differently. Rather than being a top-level EA thing, I think that x-risk should be reinstated as a top-level cause area in its own right, separate from longtermism. “Longtermism” gives the wrong impression of having a lot of time, when x-risk is an urgent short-term problem (more).
Also, I think a ≥10% chance of AGI in ≤10 years should be regarded as “crunch time”, and the headline figure for predictions/timelines should be the 10% estimate, not the 50% estimate, given the stakes (more).
Interesting point—I’ve also noticed that a lot of people misunderstand longtermism to mean ‘acting over very long timescales’ rather than ‘doing what’s best from a long-term perspective’.
I was considering writing a post making the point that, for the majority of people, their personal risk of dying in an existential catastrophe in the next few decades is higher than all their other mortality risks combined!
However, whilst I think this is probably true (and is a whole lot of food for thought!), it doesn’t necessarily follow that working on x-risk is the best way of increasing your life expectancy. Given that your personal share of the solution to x-risk will probably be quite small (maybe 1 part in 10^3–10^7), reducing your mortality by other means (lifestyle interventions targeting ordinary risks) would probably be easier. But then again, if you’re maxed out on all the low-hanging lifestyle interventions, maybe working on x-risk reduction is the way to go! :)
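To make the comparison concrete, here’s a rough back-of-the-envelope sketch. All the numbers are illustrative assumptions (a 10% chance of dying in a catastrophe, a personal share of 10^-5 from the middle of the 10^-3 to 10^-7 range, and a 1-percentage-point mortality reduction from lifestyle changes), not estimates from the comment above:

```python
# Back-of-the-envelope: personal mortality reduction from x-risk work
# vs. lifestyle interventions. All inputs are made-up illustrative values.

p_catastrophe = 0.10     # assumed chance of dying in an existential catastrophe
personal_share = 1e-5    # assumed share of the total solution attributable to you
lifestyle_gain = 0.01    # assumed absolute mortality reduction from lifestyle changes

# Your x-risk work lowers *your own* death probability by (risk × your share).
xrisk_gain = p_catastrophe * personal_share

print(f"x-risk work reduces your death risk by ~{xrisk_gain:.0e}")
print(f"lifestyle changes reduce it by       ~{lifestyle_gain:.0e}")
print(f"ratio (lifestyle / x-risk work):      {lifestyle_gain / xrisk_gain:,.0f}x")
```

On these assumed numbers the lifestyle route wins by about four orders of magnitude for purely personal life expectancy, which is the point being made: the self-interested case alone doesn’t settle where to work, even if the x-risk figure dominates your other mortality risks.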