A possible story on how the value of a longtermist’s life might be higher in a post-London-gets-nuked world than in today’s world (from my comment replying to Ben Todd’s comment on this Google Doc):
--------
I think what we actually care about is the value of a life if London gets nuked relative to if it doesn't, rather than quality-adjusted life expectancy.
This might vary a lot depending on the person. E.g., for a typical person, life after London gets nuked is probably worth significantly less (as you say), but for a longtermist altruist it seems conceivable that life is actually worth more after a nuclear war. I'm not confident that's the case in expectation (more research is needed), but here's a possible story:
Perhaps after a Russia-US nuclear war that leaves London in ruins, existential risk this century is higher, because China is more likely than the West to create AGI (relative to the world in which nuclear war didn't occur) and because China is less likely than the West to solve AI alignment. The marginal Western longtermist might make more of a difference in expectation in the post-war world than in the world without war, due to (1) the absolute existential risk being higher in the post-war world and (2) there being fewer qualified people alive in the post-war world who could meaningfully affect the development of AGI.
If the longtermist indeed does more to raise the probability of a very long-lasting and positive future in the post-war world than in the normal, low-nuclear-risk world, then the value of their life is higher in the post-war world, and so it might make sense to use >50 years of life left for this highlighted estimate. Alternatively, saving 7 hours of life expectancy in a post-war world might be more like saving 14 hours of life in a world with normal, low nuclear risk (if the longtermist's life is twice as valuable in the post-war world).
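To make that conversion concrete, here is a minimal sketch of the arithmetic, assuming a purely illustrative multiplier of 2 for the relative value of the longtermist's life post-war (the function name and all numbers are hypothetical placeholders, not estimates from the doc):

```python
# Minimal sketch of the value-adjusted comparison above.
# All numbers are illustrative assumptions, not estimates.

def normal_world_equivalent_hours(hours_saved: float, relative_value: float) -> float:
    """Convert hours of life expectancy saved in one world into
    'normal-world-equivalent' hours, given how valuable the person's
    life is in that world relative to the normal, low-nuclear-risk world."""
    return hours_saved * relative_value

# Hypothetical assumption: the longtermist's life is twice as valuable post-war.
RELATIVE_VALUE_POST_WAR = 2.0

hours_saved_post_war = 7.0
equivalent = normal_world_equivalent_hours(hours_saved_post_war, RELATIVE_VALUE_POST_WAR)
print(equivalent)  # 14.0: saving 7 post-war hours ~ saving 14 normal-world hours
```

The same one-line conversion runs in reverse, of course: if the multiplier were below 1 (as for the typical person above), post-war hours would be worth fewer normal-world hours, not more.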