The basic reason for the trend continuing so far is that NVIDIA et al have diverted normal compute expenditures into the AI boom.
I agree that the trend will stop, and it will stop sometime around 2027–2033 (this is where my uncertainty is widest). Once that happens, the probability of having AGI soon will go down quite a bit (if it hasn’t happened by then).
I don’t understand what you’re trying to say here. By “the trend”, do you mean Nvidia’s revenue growth? And what do you mean by “have diverted normal compute expenditures into the AI boom”?
I mean the trend of very fast increases in compute dedicated to AI, and what I mean is that fabs and chip manufacturers have shifted their capacity away from other customers and toward AI companies.
I still don’t follow. What point are you trying to make about my comment or about Ege Erdil’s post?
I’m trying to identify why the trend has lasted so far, so that we can predict when it will break down. That was the purpose of my comment.