Thanks for the long reply!
These are good arguments. Some were new to me, many I was already aware of. For me, the overall effect of the arguments, benchmarks, and my own experience is to make me think that a lot of scenarios are plausible. There is a wide uncertainty range. It might well be that AGI takes a long time to happen, but I also see many trends that indicate it could arrive surprisingly quickly.
For you, the overall conclusion from all the arguments is to completely rule out near-term AGI. That still seems wildly overconfident, even if there is a decent case to be made for long timelines.
Important correction to my comment above: the AI Impacts survey was actually conducted in October 2023, which is 7 months after the release of GPT-4 in March 2023. So, it does reflect AI researchers’ views on AGI timelines after they had time to absorb the impact of ChatGPT and GPT-4.
The XPT superforecasting survey I mentioned was, however, indeed conducted in 2022 just before the launch of ChatGPT in November 2022. So, that’s still a pre-ChatGPT forecast.
I just published a post here about these forecasts. I also wrote a post about two weeks ago that adapted my comments above, although unfortunately it didn’t lead to much discussion. I would love to stimulate more debate on this topic.
It would even be great if the EA Forum ran some kind of debate week or essay competition on whether near-term AGI is likely. Maybe I will suggest that.
I don’t really have a gripe with people who want to put relatively small probabilities on near-term AGI, like the superforecasters who guessed there’s a 1% chance of AGI by 2030. Who knows anything about anything? Maybe Jill Stein has a 1% chance of winning in 2028! But 50% by 2032 is definitely way too high and I actually don’t think there’s a rational basis for thinking that.