I think a major issue is that the people who would be best at predicting AGI usually don’t want to share their rationale.
Gears-level models of the phenomenon in question are highly useful in making accurate predictions. Those with the best models are either worriers who don’t want to advance timelines, or enthusiasts who want to build it first. Neither has an incentive to convince the world it’s coming soon by sharing exactly how that might happen.
The exceptions are people who have really thought about how to get from AI to AGI, but are not in the leading orgs and are either uninterested in racing or want to attract funding and attention for their approach. Yann LeCun comes to mind.
Imagine trying to predict the advent of heavier-than-air flight without studying either birds or mechanical engineering. You’d get predictions like the ones we saw historically—so wild as to be worthless, except those from the people actually trying to achieve that goal.
(copied from LW comment since the discussion is happening over here)
This seems plausible, perhaps more plausible 3 years ago. AGI is so mainstream now that I imagine there are many people who are motivated to advance the conversation but have no horse in the race.
If only the top cadre of AI experts is capable of producing such models, then yes, we may have a problem making that knowledge a public good.
Perhaps philanthropists could offer incentives to share that outweigh the incentives to stay quiet.