Of the three scenarios, stable authoritarianism and slow takeoff don't sound like they would lead to human extinction. Would you say that humanity is less likely to go extinct because of AI, and more likely to end up in a dystopian scenario instead?