“AI: Forecasters on the community forecasting platform Metaculus think that artificial intelligent systems that are better than humans at all relevant tasks will be created in 2042.”
How do you get this from the question’s operationalization?
I thought that was what was meant by AGI? I agree that the operationalisation doesn’t state that explicitly, but I thought it was implied. Do you think I should change it in the report?
I think this strongly depends on how much weight you expect forecasters on Metaculus to put on the actual operationalization rather than the question’s “vibe”. I personally expect quite a bit of weight on the exact operationalization, so I am generally not very happy with how people have been talking about this specific forecast (the term “AGI” often seems to invoke associations that are not backed by the forecast’s operationalization), and would prefer a more nuanced statement in the report.
(Note that you might believe the gap between the question’s resolution criteria and more colloquial interpretations of “AGI” is very small, but this would seem to require an additional argument on top of the Metaculus forecast.)