Conditional on successfully preventing an extinction-level catastrophe, you should expect Flourishing to be (perhaps much) lower than otherwise, because a world that needs saving is more likely to be uncoordinated, poorly directed, or vulnerable in the long run.
Preventing a single catastrophe isn’t enough to ensure survival. You need to permanently reduce x-risk to very low levels, i.e. achieve “existential security”. So the question isn’t how likely flourishing is after preventing a catastrophe; it’s how likely flourishing is after achieving existential security.
It seems to me flourishing is more likely after achieving existential security than after merely preventing an extinction-level catastrophe. Existential security should require a significant level of coordination, implying a world where we really got our shit together.
Of course there are counterexamples. We could achieve existential security through some 1984-style authoritarian system of mass surveillance, which could be a pretty bad world to live in.
So maybe the takeaway is that the approach to achieving existential security matters. We should aim for safety, but in a way that leaves the future open, much like the viatopia outcome Will MacAskill outlines.