You can extend your argument to even smaller probabilities: how much effort should go into this if we think the chance is 0.1%? 0.01%? Or in the other direction, 50%, 90%, etc. At the extremes it’s very clear that this should affect how much focus we put into averting it, and I don’t think there’s anything special about 1% vs 10% in this regard.
Another way of thinking about it is that AI is not the only existential risk. If your estimate for AI is 1% in the next ten years but your estimate for pandemics is 10%, vs. 10% for AI and 1% for pandemics, then that should also affect where you think people should focus.
Yes, it is clear. My question was: “Do we have any specific difference in mind between AI strategies for the 1 per cent in 10 years case vs. the 10 per cent in 10 years case?” If we are going to ignore the risk in both cases, there is no difference between 1 per cent and 10 per cent.
I don’t know of any short-term publicly available strategy for the 10 years case, no matter what the probability is.