I discuss expert views here. I don’t put much weight on the superforecaster estimates you mention at this point because they were made in 2022, before the dramatic shortening in timelines due to ChatGPT (let alone reasoning models).
They also (i) made compute forecasts that turned out to be very wrong, (ii) don’t seem to know that much about AI, and (iii) were selected for expertise in forecasting near-term political events, which might not generalise very well to longer-term forecasting of a new technology.
I agree we should consider the forecast, but I think it’s ultimately pretty weak evidence.
The AI experts survey also found a 25% chance of AI that “can do all tasks better than a human” by 2032. I don’t know why they think it’ll take so much longer to “automate all jobs” – it seems likely they’re just not thinking about it very carefully (especially since they estimate a ~50% chance of an intelligence explosion starting after AI can do “all tasks”). Or it could be because they think there will be many jobs where people have a strong preference for a human to fill them (e.g. priest, artist), even if AI is technically better at everything.
The AI experts have also been generally too pessimistic: in 2023, for example, they predicted that AI wouldn’t be able to do simple Python programming until 2025, even though it could probably already do that at the time. I expect their estimates in the next survey will be shorter again. And they’re also not experts in forecasting.