I do agree that there are many good reasons to think that AI practitioners are not AI forecasting experts, such as the fact that they’re, um, obviously not—they generally have no training in it and have spent almost no time on it, and indeed they give very different answers to seemingly-equivalent timelines questions phrased differently. This is a reason to discount the timelines that come from AI practitioner surveys, in favor of whatever other forecasting methods / heuristics you can come up with. It’s not per se a reason to think “definitely no AGI in the next 50 years”.
Well, maybe I should just ask: What probability would you assign to the statement “50 years from today, we will have AGI”? A couple examples:
If you think the probability is <90%, and your intention here is to argue against people who think it should be >90%, well I would join you in arguing against those people too. This kind of technological forecasting is very hard and we should all be pretty humble & uncertain here. (Incidentally, if this is who you’re arguing against, I bet that you’re arguing against fewer people than you imagine.)
If you think the probability is <10%, and your intention here is to argue against people who think it should be >10%, then that’s quite a different matter, and I would strongly disagree with you, and I would be very curious how you came to be so confident. I mean, a lot can happen in 50 years, right? What’s the argument?
Have you read https://www.cold-takes.com/where-ai-forecasting-stands-today/ ?