I kind of object to the title of this post. It’s not really AI forecasting you want, insofar as forecasting is understood as generating fairly precise numerical estimates by finding a reference class, establishing a base rate, and applying a beautiful sparkle of intuition. You’re making the case for AI-informed speculation, which is a different thing altogether. The climate analogy you make is pretty dubious: we have a huge historical sample of past climates and at least some understanding of what drove climate change historically, so we can build reasonably predictive climate models. This is not the case for AGI, and I doubt we can actually reduce our uncertainty much.
You’re right that “forecasting” might not be the right word. Informed speculation might be more accurate, but that might confuse people, since there’s already plenty of work people call “AI forecasting” that looks similar to what I’m talking about.
I also think that a lot of AI forecasting can be done in the sense you described, by “generating fairly precise numerical estimates by finding a reference class, establishing a base rate, and applying a beautiful sparkle of intuition”. For example, if you look at Epoch’s website, you can find work that follows that methodology, e.g. here.
I also agree that climate change researchers have access to much more historical data and that, in some ways, the problem they’re working on is easier than the one I’m trying to work on. I still think that AI forecasting and climate forecasting are conceptually similar, however. And in fact, to the extent that AI plays a large role in shaping the future of life on Earth, climate forecasts should probably take AI into account. So, these problems are interrelated.
Informed speculation might … confuse people, since there’s already plenty of work people call “AI forecasting” that looks similar to what I’m talking about.
Yes, I think using the term “forecasting” for what you do is established usage—it’s effectively a technical term. Calling it “informed speculation about AI” in the title would not be helpful, in my view.
Great post, btw.