Ironically, one of the two predictions you quote as an example of a bad prediction is in fact an example of a good prediction: “The most realistic estimate for a seed AI transcendence is 2020.”
Currently it seems that AGI/superintelligence/singularity/etc. will happen sometime in the 2020s. Yudkowsky’s median estimate in 1999 was apparently 2020, so he probably had something like 30% of his probability mass in the 2020s, and maybe 15% of it in the 2025-2030 period, when IMO it’s most likely to happen.
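For concreteness, here is a rough sketch of the kind of back-of-the-envelope calculation behind those percentages. The lognormal shape and the spread parameter are my own assumptions (no distribution was specified in the 1999 source); the only input actually taken from it is the median of 2020, i.e. 21 years out from 1999.

```python
# Sketch only: model the 1999 forecast as a lognormal distribution over
# "years until AGI" with median 21 (i.e. 2020) and an *assumed* spread,
# then see how much probability mass lands in 2020-2030 and 2025-2030.
from math import erf, log, sqrt

def lognormal_cdf(x: float, median: float, sigma: float) -> float:
    """CDF of a lognormal with the given median and log-space std dev."""
    return 0.5 * (1.0 + erf((log(x) - log(median)) / (sigma * sqrt(2.0))))

MEDIAN_YEARS = 21.0   # 1999 + 21 = 2020, the quoted median
SIGMA = 0.5           # assumed spread; the original gives no such number

def mass(year_lo: int, year_hi: int) -> float:
    """Probability mass between two calendar years, measured from 1999."""
    return (lognormal_cdf(year_hi - 1999, MEDIAN_YEARS, SIGMA)
            - lognormal_cdf(year_lo - 1999, MEDIAN_YEARS, SIGMA))

print(f"mass in 2020-2030: {mass(2020, 2030):.0%}")  # ~28% under these assumptions
print(f"mass in 2025-2030: {mass(2025, 2030):.0%}")  # ~12% under these assumptions
```

Under those (made-up but plausible) assumptions you get roughly 28% on the 2020s and roughly 12% on 2025-2030, which is the ballpark I had in mind above.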
Now let’s compare to what other people would have been saying at the time. Almost all of them would have put 0% on the 2025-2030 period, and maybe the smarter and more rational ones would have said something like 1%.
To put it in non-quantitative terms, almost everyone else in 1999 would have been saying “AGI? Singularity? That’s not a thing, don’t be ridiculous.” The smarter and more rational ones would have been saying “OK, it might happen eventually, but it’s nowhere in sight; it’s silly to start thinking about it now.” Yudkowsky said “It’s about 21 years away, give or take; we should start thinking about it now.” Now, with the benefit of 24 years of hindsight, Yudkowsky was a lot closer to the truth than all those other people.
Also, you didn’t reply to my claim. Who else has been talking about AGI etc. for 20+ years and has a similarly good track record? Which of them managed to make only correct predictions when they were teenagers? Certainly not Kurzweil.