Note that at least 25% of ‘AI experts’ believe there’s a 100% probability of automation by 2103… doesn’t seem like they’re really experts to me.
Great point, Dillon! I strongly upvoted it. I very much agree a 100 % chance of full automation by 2103 is too high. This reminds me of a few “experts” and “superforecasters” in the Existential Risk Persuasion Tournament (XPT) who predicted a probability of human extinction from 2023 to 2100 of exactly 0. (“Null values” below refers to values of exactly 0.)

In the XPT case, respondents may have answered exactly 0 to represent a very low extinction risk. However, for the predictions about automation, it would be really strange if people replied 100 % to mean something like 90 %, so I assume they are just overconfident.