You might find the thread “The AI messiah” and the comments there interesting.
You quote AI results from the 70s and 90s as examples of overly optimistic AI predictions.
In recent years there have been many examples of predictions being too conservative (e.g. DeepMind's AlphaGo beating Lee Sedol at Go in 2016, GPT-3, Minerva, Imagen …).
Self-driving seems to be the only field where progress has been slower than some expected. See e.g.
https://bounded-regret.ghost.io/ai-forecasting-one-year-in/: "progress on ML benchmarks happened significantly faster than forecasters expected" (even if that result was sensitive to the exact timing of a single paper, I think it's a useful data point).
Would that make you treat AI risk as a higher priority?
I will check out the article.
I was unaware of these more recent predictions; they increase the credibility of AI risk in my mind to some degree.
Until generalized artificial intelligence actually exists, I can't see it as a higher priority than climate change. If it existed but was contained, I would treat it as a higher priority, and even more so if there were points where it almost escaped but didn't.