Given that NVIDIA dominates the AI-relevant hardware market, we can infer that, since its data center revenue is currently on the order of $15 billion per year, and given the ~21.3% compound annual growth rate in its revenue since 2012 plus rapid hardware progress in the meantime, there is likely less than $75 billion worth of AI hardware in the world today on par with the H100 80 GB.
I don’t understand where the $75 billion is coming from. It looks like you are dividing the $15 billion by the 21.3%, but that doesn’t make sense as the 21.3% is revenue growth. Surely market share needs to come into it somewhere?
The $75 billion figure is an upper bound on the value of AI GPUs that could have been sold over the last ~10 years (since the start of the deep learning revolution), adjusted for hardware efficiency. There are multiple ways to estimate this quantity, but I think any reasonable estimate will come in below $75 billion, making the figure conservative.
To give a little bit more rigor, let’s take the $4.28 billion figure from NVIDIA’s most recent quarterly data center revenue and assume that this revenue has grown at 4.95% every quarter for the last 10 years, to match the 21.3% yearly growth number in the post (note: this is conservative, since the large majority of the data center revenue has come in the last few years). For hardware progress, let’s also assume that FLOP/$ has been increasing at a rate of 7% per quarter, which implies a doubling time of 2.5 years, roughly equal to the trend before hardware was accelerated for deep learning.
Combining these facts, we can start with $4.28B / (1.0495 + 0.07)^40 = $0.0468B as an estimate of NVIDIA’s AI quarterly revenue 10 years ago adjusted for hardware efficiency, and sum over the next 40 quarters of growth to get $56.37 billion. NVIDIA is responsible for about 80% of the market share, at least according to most sources, so let’s just say that the true amount is $56.37B × 1/0.8 = $70.46B.
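The deflation step can be sanity-checked numerically (my own sketch, not the original commenter’s code; the variable names are mine):

```python
# Deflate NVIDIA's latest quarterly data-center revenue ($4.28B) back
# 40 quarters, using the combined quarterly factor from the comment:
# 1.0495 (revenue growth) plus 0.07 (FLOP/$ improvement), added together.
latest = 4.28            # $B, most recent quarter
r = 1.0495 + 0.07        # combined quarterly factor, 1.1195
start = latest / r ** 40 # efficiency-adjusted quarterly revenue 10 years ago
print(round(start, 4))   # → 0.0468 ($B), matching the figure above
```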
I’m wondering if it might be better to take the cutoff as 2016, the start of the large-scale era[1]? I’m guessing that it might give similar results. Let’s see: from 2016 to 2022, NVIDIA revenue grows from $6.13B to $28.56B, a compound growth of 29.2% per year, or 6.62% per quarter. For FLOP/$, it seems more relevant to take “the trend implied by GPUs used in ML”, which “shows a doubling time of 2.07 years”. This translates into FLOP/$ increasing at 8.73% per quarter.
Combining these facts[2], we can start with $6.13B / (1.0662 + 0.0873)^30 = $0.0852B as an estimate of NVIDIA’s AI quarterly revenue 7 years ago adjusted for hardware efficiency, and sum over the next 30 quarters of growth (start of 2016 to Q2 2023) to get $39.7B[3]. NVIDIA is responsible for about 80% of the market share, at least according to most sources, so let’s just say that the true amount is $39.7B × 1/0.8 = $49.6B. Call it $50B.
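The whole 2016-cutoff estimate can be reproduced in a few lines (my sketch; the structure and names are mine, and small differences from the quoted $39.7B come from rounding the quarterly rates):

```python
# Derive the quarterly rates from the annual figures, then sum the
# efficiency-adjusted quarterly revenues from the start of 2016.
growth_q = 1.292 ** 0.25 - 1        # 29.2%/yr revenue growth -> ~6.62%/qtr
flop_q = 2 ** (1 / (2.07 * 4)) - 1  # 2.07-yr FLOP/$ doubling -> ~8.73%/qtr
r = 1 + growth_q + flop_q           # combined quarterly factor, ~1.1535

start = 6.13 / r ** 30                         # adjusted Q1-2016 revenue, ~$0.085B
total = sum(start * r ** k for k in range(30)) # geometric sum over 30 quarters
print(round(total, 1))                         # → ~39.4 ($B), close to the $39.7B above
print(round(total / 0.8, 1))                   # → ~49.2 ($B) after the 80%-market-share adjustment
```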
This is also because it seems unlikely that pre-2016 machines would be economical to run from an energy-use perspective.
I’m copying Matthew’s final paragraph here, using my numbers.
Note that this is in line with my answer of $35.4B here for Matthew’s calc.
Thanks! Although I’m getting $35.4B where you are getting $56.37B.
Looks like my calculation was probably in error then. Thanks for checking.
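For what it’s worth, redoing the original 10-year geometric sum numerically (my sketch, using the same additive combination of rates as the earlier comment) does confirm the correction:

```python
# Re-run the original 10-year sum: deflate $4.28B back 40 quarters at
# the combined factor 1.0495 + 0.07, then sum the 40 quarters back up.
r = 1.0495 + 0.07                              # combined quarterly factor, 1.1195
start = 4.28 / r ** 40                         # adjusted quarterly revenue 10 years ago
total = sum(start * r ** k for k in range(40)) # geometric sum over 40 quarters
print(round(total, 1))                         # → 35.4 ($B), matching the correction
```

So the series comes to ~$35.4B, not the $56.37B first quoted.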