This is a big question, of which there are many benchmarks and prediction markets to hint at a potential answer. Here are two potential prediction markets worth looking at:
https://www.metaculus.com/questions/5121/date-of-first-agi-strong/
https://www.metaculus.com/questions/3479/date-weakly-general-ai-is-publicly-known/
Yes, I have read those, and I accept that lots of people believe human-level AGI will arrive within 20 years and that it’s just a matter of time. But I don’t understand why people are so confident about this. Do they think today’s AI algorithms are already capable, in principle, of most tasks on earth, and that all we lack is faster compute?
Or do people think current GPT systems are already very close to AGI? If so, what are the supporting arguments? (I’ve read “Sparks of Artificial General Intelligence” by Microsoft Research.)