What is the actionable difference between “1-2 per cent” and “10 per cent” predictions? If we knew that an asteroid is coming towards Earth and will hit it with one of these probabilities, how would our attempts to divert it depend on the probability of impact?
Should we ignore a 1 per cent probability, but go all-in on preventing a 10 per cent probability?
If there is no difference in actions, the difference in probability estimates is rather meaningless.
You can extend your argument to even smaller probabilities: how much effort should go into this if we think the chance is 0.1%? 0.01%? Or in the other direction, 50%, 90%, etc.? At the extremes it’s very clear that this should affect how much focus we put into averting it, and I don’t think there’s anything special about 1% vs 10% in this regard.
Another way of thinking about it is that AI is not the only existential risk. If your estimate for AI is 1% in the next ten years and 10% for pandemics, versus 10% for AI and 1% for pandemics, that should also affect where you think people should focus.
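To make the prioritisation point concrete, here is a minimal sketch under a simple expected-value model with a fixed effort budget. The loss size and the proportional-allocation rule are illustrative assumptions only, not anyone’s actual estimate or strategy; the point is just that flipping the 1%/10% estimates between AI and pandemics flips where most of the effort goes.

```python
# Illustrative sketch: under a naive expected-value model, the weight given to
# a risk scales with its probability, so a 10% risk gets ~10x the weight of a
# 1% risk, all else equal. Numbers below are assumptions for illustration.

def expected_loss(probability: float, loss_if_it_happens: float) -> float:
    """Expected loss from a risk, assuming loss scales linearly with probability."""
    return probability * loss_if_it_happens

LOSS = 1.0  # normalised size of the catastrophe, assumed equal for both risks

scenarios = {
    "AI 1%, pandemics 10%": {"AI": 0.01, "pandemics": 0.10},
    "AI 10%, pandemics 1%": {"AI": 0.10, "pandemics": 0.01},
}

for name, probs in scenarios.items():
    total = sum(expected_loss(p, LOSS) for p in probs.values())
    # Naive proportional allocation of a fixed effort budget across the risks.
    shares = {risk: expected_loss(p, LOSS) / total for risk, p in probs.items()}
    print(name, {risk: f"{share:.0%}" for risk, share in shares.items()})
```

In the first scenario roughly 90% of the budget goes to pandemics, and in the second roughly 90% goes to AI. Of course this toy model ignores tractability and diminishing returns, but it shows why the relative estimates are not meaningless for allocation.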
Yes, that is clear. My question was: “Do we have any specific difference in mind about AI strategies for the 1 per cent in 10 years case vs. the 10 per cent in 10 years case?” If we are going to ignore the risk in both cases, there is no difference whether it is 1 per cent or 10 per cent.
I don’t know of any short-term publicly available strategy for the 10-year case, no matter what the probability is.