I feel like “x-risk” is almost tautologically important, and thus ceases to be a useful word in many cases. It’s the longtermist equivalent of a neartermist saying “it would be good to solve everything really bad about the current world”.