‘Five Years After AGI’ Focus Week happening over at Metaculus.
Inspired in part by the EA Forum’s recent debate week, Metaculus is running a “focus week” this week, aimed at making intellectual progress on the question “What will the world look like five years after AGI (assuming that humans are not extinct)[1]?”
Leaders of AGI companies, while vocal about some of what they anticipate in a post-AGI world (for example, bullishness about AGI making scientific advances), seem deliberately vague about other aspects: power (will AGI companies have a lot of it? all of it?), whether some of those scientific advances might backfire (e.g., a vulnerable world scenario or a race-to-the-bottom digital minds takeoff), and how exactly AGI will be used for “the benefit of all.”
Forecasting questions for the week range from “Percentage living in poverty?” to “Nuclear deterrence undermined?” to “‘Long reflection’ underway?”
Those interested: head over here. You can participate by:

- Forecasting
- Commenting
  - Comments are especially valuable on long-term questions, because the forecasting community has less of a track record at these time scales.[2][3]
- Writing questions
  - There may well be some gaps in the admin-created question set.[4] We welcome question contributions from users.
The focus week will likely be followed by an essay contest, since we believe a large part of the value of this initiative lies in generating concrete stories for how the future might play out (and what the inflection points might be). More details to come.[5]
[1] This is not to say that we firmly believe extinction won’t happen. I personally put p(doom) at around 60%. At the same time, however, as I have previously written, I believe that more important trajectory changes lie ahead if humanity does manage to avoid extinction, and that it is worth planning for these things now.
[2] Moreover, I personally take Nuño Sempere’s “Hurdles of using forecasting as a tool for making sense of AI progress” piece seriously, especially the “Excellent forecasters and Superforecasters™ have an imperfect fit for long-term questions” part.
[3] With short-term questions on things like geopolitics, I think one should basically just defer to the Community Prediction. Conversely, with certain long-term questions I believe it’s important to interrogate how forecasters are reasoning about the issue at hand before assigning their predictions too much weight. Forecasters can help themselves by writing comments that explain their reasoning.

In addition, stakeholders we work with, who look at our questions with a view to informing their grantmaking, policymaking, etc., frequently say that they would find more comments valuable in helping bring context to the Community Prediction.
[4] All blame on me, if so.
[5] Update: I ended up leaving Metaculus fairly soon after writing this post. I think that means the essay contest is less likely to happen, but I guess stay tuned in case it does.