Did you look at the Metaculus resolution criteria? They seem extremely weak to me. I'd be interested to know which criteria you think o3 (or whatever the best OAI model is) is furthest away from.
To be honest I did not read the post, I just looked at the poll questions. I was thinking of AGI in the way I would define it*, or as the other big Metaculus AGI question defines it. For the “weakly general AI” question, yeah I think 50% chance is fair, maybe even higher than 50%.
*I don’t have a precise definition but I think of it as an AI that can do pretty much any intellectual task that an average human can do
Yeah, that’s fair. I’m a lot more bullish on getting AI systems that satisfy the linked question’s definition than ones that satisfy my own.