Tyler Cowen on the effect of AGI on real rates:

In standard models, a big dose of AI boosts productivity, which in turn boosts the return on capital, which then raises real interest rates.
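The textbook channel can be made concrete with the steady-state Euler equation of a standard Ramsey growth model. A minimal sketch, using the conventional symbols rather than anything from the passage itself: ρ is the rate of time preference, σ the inverse of the intertemporal elasticity of substitution, and g per-capita productivity growth.

```latex
% Steady-state Euler equation of the textbook Ramsey model:
% faster trend productivity growth g raises the real rate r.
\[
  r = \rho + \sigma g
\]
% Example: with rho = 1% and sigma = 1, raising g from 2% to 4%
% raises r from 3% to 5%.
```

This is the mechanism the rest of the passage pushes back against.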
I am less convinced. For one thing, I believe most of the gains from truly fundamental innovations are not captured by capital. Was Gutenberg a billionaire? The more fundamental the innovation, the more broadly the import of the core idea spreads across goods and sectors.
Furthermore, over the centuries real rates of return seem to have been falling, even though that span includes some high-productivity eras, such as the 1920s. The long-run secular trend might overwhelm temporary productivity blips; I simply do not know.
I do think AI is likely to increase the variance of relative prices. Observers disagree about where the major impacts will be felt, but possibly some prices (tutoring and medical diagnosis?) will fall a great deal while other prices will not. Furthermore, only some individuals will enjoy those relative price declines, as many may remain skittish about AI for quite a few years, possibly an entire generation.
That heterogeneity and lack of stasis will make it harder to infer real interest rates from observed nominal interest rates. Converting nominal to real variables is easiest under conditions of relative stasis, but that is exactly what AI is likely to disrupt. Furthermore, the inflation rates individuals actually face, and thus the real interest rates they face, are likely to increase in variance.
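To see why that variance blurs the inference, consider the Fisher approximation, real rate ≈ nominal rate minus inflation, applied person by person. The sketch below is purely illustrative: the budget shares, price changes, and the 5% nominal rate are hypothetical numbers chosen only to make the variance point concrete.

```python
# Illustrative sketch: one observed nominal rate, but person-specific
# real rates once inflation is computed from individual baskets.
# All numbers are hypothetical.

nominal_rate = 0.05  # the single observed nominal interest rate

# Hypothetical price changes for two categories AI affects unevenly.
price_changes = {"ai_exposed": -0.10,   # e.g., tutoring, diagnosis
                 "other": 0.04}         # everything else

# Hypothetical budget shares: AI adopters buy more of the cheapening goods.
baskets = {
    "ai_adopter": {"ai_exposed": 0.40, "other": 0.60},
    "ai_skeptic": {"ai_exposed": 0.05, "other": 0.95},
}

for person, shares in baskets.items():
    # Person-specific inflation: budget-share-weighted price changes.
    inflation = sum(shares[k] * price_changes[k] for k in shares)
    # Fisher approximation: real rate = nominal rate - inflation.
    real_rate = nominal_rate - inflation
    print(f"{person}: inflation {inflation:+.1%}, real rate {real_rate:+.1%}")
```

Here the same observed nominal rate implies a real rate of roughly 6.6% for the heavy AI adopter and roughly 1.7% for the skeptic, so no single "real rate" summarizes the economy.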
Overall, that blurring of nominal and real will make the Fed’s job harder. And it will be harder for Treasury to forecast “forthcoming real interest rates.”