As I understand it your argument is “Even if AI could lead to explosive growth, we’ll choose not to do it because we don’t yet know what we want”. This seems pretty wild; does literally no one want to make tons of money in this scenario?
I don’t think your summary is wrong as such, but it’s not how I think about it.
Suppose we’ve got great AI that, in practice, we still use with a wide variety of control inputs (“make better batteries”, “create software that does X”). Then it could be the case—if AI enables explosive growth in other domains—that “production of control inputs” becomes the main production bottleneck.
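To make that concrete, here’s a toy sketch in Python (every number invented purely for illustration, nothing calibrated to anything) of a complementary, min-style production function: if AI capacity explodes while human-paced control inputs grow at a few percent a year, output growth collapses to the human-paced rate.

```python
# Toy model of a production bottleneck. All parameters are made up
# for illustration; this is a sketch of the mechanism, not an estimate.

ai_capacity = 1.0      # assume this grows explosively, say 10x per year
control_inputs = 1.0   # human-produced task specifications, ~3% per year

for year in range(1, 11):
    ai_capacity *= 10.0
    control_inputs *= 1.03
    # Leontief-style complementarity: production needs both inputs in
    # fixed proportion, so output tracks whichever input is scarcer.
    output = min(ai_capacity, control_inputs)
    print(f"year {year:2d}: output = {output:.2f}")
```

The point is only that under strong complementarity the slow input sets the growth rate; if AI output substitutes freely for control inputs, the bottleneck dissolves.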
Alternatively, suppose there’s a “make me a lot of money” AI, and that money-making is basically about making stuff that people want to buy. You can sell more stuff that people are already known to want, but that runs into the limit that people only want a finite amount of stuff. You could alternatively sell new stuff that people want but don’t know it yet. This is still limited by the number of people in the world, how often each of them is willing to consider adopting a new technology, which things someone with life history X is actually likely to adopt, and how long it takes them to make that decision. None of these seem likely to scale indefinitely with AI capability.
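As a back-of-the-envelope version of why these things cap revenue growth (again, every number below is an invented placeholder, not an estimate):

```python
# Rough ceiling on revenue from "new stuff people don't yet know they want".
# All figures are invented placeholders chosen only to show the structure.

population = 8e9            # potential adopters
adoptions_per_year = 20     # new products a person seriously considers annually
value_per_adoption = 100.0  # average new spending per adoption, in dollars

# The ceiling scales with population and human decision bandwidth,
# not with AI capability:
max_new_revenue = population * adoptions_per_year * value_per_adoption
print(f"ceiling on new-product revenue: ${max_new_revenue:.2e} per year")
```

You can push the per-person numbers up, but they’re bounded by human attention and decision speed, which is the mechanism I’m gesturing at.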
This argument could be defeated either by money-making not actually being about making stuff people want (which seems fairly likely, but in that case I don’t really know what to think) or by AI capability leading to (explosive?) human population expansion.
In defence of this not being completely wild speculation: advertising already comprises a nontrivial fraction of economic activity, and seems to be growing faster than other sectors: https://www.statista.com/statistics/272443/growth-of-advertising-spending-worldwide/
(Although only a small fraction of advertising promotes the adoption of new tech.)
I don’t disagree with anything you’ve written here, but I’m not seeing why the limits these mechanisms impose are anywhere close to where we are today.
We might just be talking past each other. I’m not saying this is a reason to be confident explosive growth won’t happen, and I agree that growth could go much faster before hitting limits like these. I just meant to say “here’s a speculative mechanism that could break some of the explosive growth models”.
Ah fair enough. In that case I agree.