Other commenters are arguing that next time things might be different, due to the nature of technological risks like AI. I agree, but I think there’s an even simpler reason to focus attention on rapid-extinction scenarios: we don’t have as much time to prevent them!
If we were equally worried about extinction due to AI versus extinction due to slow economic stagnation / declining birthrates / political decay / etc., we might still want to put most of our effort into solving AI. As they say, “there’s a lot of ruin in an empire”—if human civilization were on track to dwindle away over centuries, that would also mean we’d have centuries to try to turn things around.
This is a good point, thanks!