This is a response more befitting Jim Cramer’s Chihuahua than Jeremy Bentham’s Bulldog.
According to AI 2027, before the end of 2027, OpenAI has:
a “country of geniuses in a datacenter,” each:
75x more capable than the best human at AI research
“wildly superhuman” at coding, hacking, and politics
330K Superhuman AI Researcher copies thinking at 57x human speed
In their slowest projection, by April 2028, OpenAI has achieved generalised superintelligence.
But you’re only willing to bet US GDP grows just 10%, in just one year, across the next 15? The US did 7.4% in 1984. Within 10 years—five years before your proposed bet resolves—you’ve predicted a 95-trillion-fold increase in AI research capacity under a ‘conservative scenario.’ According to your eighth section, this won’t cause major bottlenecks elsewhere that would seriously stifle growth.
If this is really the best bet you’re willing to offer, one of three things is true:
You’re wildly risk averse
You don’t believe what you’re writing
You’re misleadingly leaving out the fine print (e.g. “I’d put about 60% odds on the kind of growth depicted variously in AI 2027, but not any time close to when they actually predict it will happen”)
Which is it?
Weirdly aggressive reply.
First of all, the AI 2027 people disagree about the numbers. Lifland’s median is nearer to 2031. I have a good amount of uncertainty, so I wouldn’t be shocked if, say, we don’t get the intelligence explosion for a decadeish.
“you’ve predicted a 95-trillion-fold increase in AI research capacity under a ‘conservative scenario.’” is false. I was just giving that as an example of the rapid exponential growth.
So the answer, in short, is that I’m not very confident in extremely rapid growth within the next few years. I’d probably put +10% GDP growth by 2029 below 50%.
To respond briefly:
1. “First of all, the AI 2027 people disagree about the numbers.”
That’s irrelevant to your claim that you’d put “60% odds on the kind of growth depicted in AI 2027”.
2. “‘you’ve predicted a 95-trillion-fold increase in AI research capacity under a “conservative scenario”’ is false. I was just giving that as an example of the rapid exponential growth.”
Here’s what you wrote:
“This might sound outrageous, but remember: the number of AI models we can run is going up 25x per year! Once we reach human level, if those trends continue (and they show no signs of stopping) it will be as if the number of human researchers is going up 25x per year. 25x yearly increases is a 95-trillion-fold increase in a decade.”
You then go on to outline reasons why it would actually be faster than that. If you aren’t predicting this 95-trillion-fold increase, then either:
1. The trends do indeed show signs of stopping
2. The number of AI models you can run isn’t really going up 25x YOY
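For what it’s worth, the arithmetic in the quoted passage does check out: 25x growth compounded over ten years is 25¹⁰ ≈ 9.5 × 10¹³, i.e. roughly 95 trillion. A one-line sanity check:

```python
# Sanity check on the quoted claim: 25x yearly growth, compounded for a decade
growth_per_year = 25
years = 10
total = growth_per_year ** years
print(f"{total:,}")  # 95,367,431,640,625 — roughly a 95-trillion-fold increase
```

So the dispute isn’t over the multiplication; it’s over whether the 25x-per-year trend actually continues for a decade.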
We can talk all day, but words are cheap. I’d much rather bet. Bets force you to get specific about what you actually believe. They make false predictions costly and true ones profitable. They signal what you actually believe, not whatever you think will earn the most status, clicks, views, or shares.
What’s the minimum percentage chance of greater than 10% GDP growth in 2029 that you think is plausible given the trends you’re writing about and how much are you willing to bet at those odds? I’d rather bet on an earlier year, but I’d accept 2029 if that’s all you’ve got in you.
To be explicit, I’m trying to work out what you actually believe and what is just sensationalised.