Thanks David. I agree that the Metaculus question is a mediocre proxy for AGI, for the reasons you give. We included it primarily because it shows the magnitude of the AI timelines update that we and others have made over the past few years.
In case it’s helpful context, here are two footnotes that I included in the strategy document that this post is based on, but that we cut for brevity in this EA Forum version:
For the purposes of this document, we define AGI using the Morris et al./DeepMind (2024) definition of "competent AGI" (see Table 1): an AI system that performs at least as well as 50% of skilled adults at a wide range of non-physical tasks, including metacognitive tasks like learning new skills.
This DeepMind definition of AGI is the one we primarily use internally. That said, I think we may get strategically significant AI capabilities before this point, for example via automated AI R&D.
On the Metaculus definition, I included this footnote:
The headline Metaculus forecast on AGI doesn't fully line up with the Morris et al. (2024) definition of AGI that we use in footnote 2. For example, the Metaculus definition includes robotic capabilities, and doesn't require the ability to carry out long-term planning and execution loops. Nonetheless, I think it's the closest proxy for an AGI timeline that I've found on a public prediction market.
Thanks, that is reassuring.