Epistemic status: quick Google search, uncertain about everything, have not read the linked papers. ~15 minutes of time investment.
Source 1
The Carbon Footprint of ChatGPT
[...] ChatGPT is based on a version of GPT-3. It has been estimated that training GPT-3 consumed 1,287 MWh which emitted 552 tons CO2e [1].
Using the ML CO2 Impact calculator, we can estimate ChatGPT’s daily carbon footprint to 23.04 kgCO2e.
[...] ChatGPT probably handles way more daily requests [compared to Bloom], so it might be fair to expect it has a larger carbon footprint.
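For context, here is a quick back-of-the-envelope sketch relating the numbers quoted above (my own arithmetic, not something from the source):

```python
# Back-of-the-envelope arithmetic on the Source 1 numbers (my own calculation).
training_energy_kwh = 1_287_000   # 1,287 MWh reported for GPT-3 training
training_emissions_kg = 552_000   # 552 tCO2e reported for GPT-3 training

# Grid carbon intensity implied by those two figures
intensity = training_emissions_kg / training_energy_kwh
print(f"Implied grid intensity: {intensity:.3f} kgCO2e/kWh")   # ~0.429

# Scaling the estimated daily inference footprint to a full year
daily_inference_kg = 23.04
print(f"Yearly inference estimate: {daily_inference_kg * 365 / 1000:.1f} tCO2e")  # ~8.4
```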
Source 2
The carbon footprint of ChatGPT
3.82 tCO₂e per day
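Note how far apart the two daily estimates are; a quick unit conversion (my own arithmetic, not from either source) makes the gap explicit:

```python
# Comparing the two daily estimates above (my own arithmetic).
source1_daily_kg = 23.04          # kgCO2e/day from Source 1
source2_daily_kg = 3.82 * 1000    # 3.82 tCO2e/day from Source 2, converted to kg

print(f"Source 2 is ~{source2_daily_kg / source1_daily_kg:.0f}x higher than Source 1")  # ~166x
```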
Also, maybe take a look at this paper about a different language model:
Estimating the Carbon Footprint of BLOOM, a 176B Parameter Language Model
https://arxiv.org/pdf/2211.02001.pdf
Quantifying the Carbon Emissions of Machine Learning
https://arxiv.org/pdf/1910.09700.pdf
You can play a bit with this calculator, which was also used in Source 1:
ML CO2 Impact
https://mlco2.github.io/impact/
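As far as I understand, the calculator essentially multiplies hardware power draw by runtime and a regional grid carbon intensity (the approach described in the Lacoste et al. paper above). Here is a minimal sketch of that style of estimate; every number is an illustrative assumption on my part, not a default from the tool:

```python
# Minimal sketch of a calculator-style estimate: power draw x runtime x grid carbon intensity.
# Every number below is an illustrative assumption, not a value from the ML CO2 Impact tool.
gpu_power_kw = 0.3        # assumed average draw per GPU (300 W)
num_gpus = 16             # assumed number of GPUs serving requests
hours_per_day = 24        # continuous operation
grid_intensity = 0.43     # assumed kgCO2e per kWh (varies a lot by region)

energy_kwh = gpu_power_kw * num_gpus * hours_per_day
emissions_kg = energy_kwh * grid_intensity
print(f"{energy_kwh:.1f} kWh/day -> {emissions_kg:.1f} kgCO2e/day")
```

The large spread between the two estimates above presumably comes down to which hardware count, utilization, and grid region one plugs into a formula like this.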
I think a central idea here is that a superintelligence could innovate and thus find more energy-efficient ways of running itself. We already see a trend of language models with the same capabilities becoming more energy efficient over time through algorithmic improvements and better parameter/data ratios. So even if the first superintelligence requires a lot of energy, the systems developed in the period after it will probably need much less.
Thanks a lot, Felix! That’s very generous, and some of the links have even more relevant stuff. Apparently, ChatGPT uses around 11,870 kWh per day, whereas the average human body uses about 2.4 kWh per day.
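For a sense of scale, dividing one figure by the other (my own arithmetic on the numbers above):

```python
# ChatGPT's estimated daily energy use vs. one human body's (my own arithmetic).
chatgpt_kwh_per_day = 11_870
human_kwh_per_day = 2.4

print(f"ChatGPT ~ {chatgpt_kwh_per_day / human_kwh_per_day:,.0f} human bodies' worth of daily energy")
# -> roughly 4,950 human bodies
```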