You can now forecast how much AI benchmark progress will continue to be underestimated by the Metaculus Community Prediction (CP), on this Metaculus question! Thanks to @Javier Prieto for prompting us to think more about this and inspiring this question!
Predict a distribution with a mean of
≈0.5, if you expect the CP to be decently calibrated or just aren’t sure about the direction of bias,
>0.5, if you think the CP will continue to underestimate AI benchmark progress,
<0.5, if you think the CP will overestimate AI benchmark progress, e.g. by overreacting to this post.
Here is a Colab Notebook to get you started with some simulations.
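If you want a feel for what such a simulation involves before opening the notebook, here is a minimal sketch. Everything in it is hypothetical: the number of underlying questions, your assumed per-question probability that the CP underestimates progress, and the notebook's actual setup may all differ.

```python
import random

random.seed(0)

# Hypothetical parameters -- not taken from the actual question set.
n_questions = 20        # assumed number of underlying benchmark questions
p_underestimate = 0.6   # your belief that the CP underestimates any one of them

def simulate_mean(n_sims=10_000):
    """Estimate the mean resolution value: the expected fraction of
    questions on which the CP underestimates actual progress."""
    samples = [
        sum(random.random() < p_underestimate for _ in range(n_questions))
        / n_questions
        for _ in range(n_sims)
    ]
    return sum(samples) / len(samples)

print(simulate_mean())
```

With these made-up numbers the simulated mean lands near 0.6, i.e. above 0.5, which is the kind of output you would feed into a distribution centered above 0.5 if you expect continued underestimation.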
And don’t forget to update your forecasts on the underlying AI benchmark progress questions if the CP on this one has a mean far from 0.5!
Disclaimer: I work for Metaculus.