I mostly agree with this—our powers and coordination are beyond impressive when we wield them. So an extinction risk would have to explain why we can't or don't use all of our resources to stop our own demise. Potential examples: feedback loops that are selfishly beneficial and prevent coordination, even though they're leading to a slow death overall; or cases where the collapse is slow but locked in ahead of time, so even if we decide to move heaven and earth to do something about it, it's too late.