The theoretical computational limit of the Solar System is 1.47x10^49 bits per second.
Part 1
The limit is based on a computer operating at the Landauer Limit, at the temperature of the cosmic microwave background, powered by a Dyson sphere operating at the efficiency of a Carnot engine. [EDIT: this proposed limit is too low; since the Landauer Limit can be circumvented, the figure should be treated as a lower bound.]
Relevant equations
Carnot efficiency η=1-(Tc/Th)
Landauer limit E=KbTcln(2)
Bit rate R=Pη/E
Relevant values
Boltzmann constant [Kb] (J K-1) 1.38E-23
Power output of the sun [P] (W) 3.83E+26
Temperature of the surface of the sun [Th] (K) 5.78E+03
Temperature of cosmic microwave background [Tc] (K) 2.73
Calculations
Carnot efficiency η=1-(Tc/Th)
η=1-(2.73/5.78E+03)
η=1.00
Landauer limit E=KbTcln(2)
E=1.38E-23*2.73*0.693
E=2.61E-23 joules per bit
Bit rate R=Pη/E
R=3.83E+26*1.00/2.61E-23
R=1.47E+49 bits per second
Notes
Numbers are shown rounded to 3 significant figures, full values were used in calculations.
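The Part 1 derivation can be reproduced as a quick sanity check in Python (a minimal sketch using the constants from the tables above; variable names are my own):

```python
import math

# Inputs from the tables above
k_B = 1.38e-23      # Boltzmann constant, J/K
P_sun = 3.83e26     # power output of the Sun, W
T_h = 5.78e3        # temperature of the Sun's surface, K
T_c = 2.73          # temperature of the cosmic microwave background, K

# Carnot efficiency of a heat engine running between T_h and T_c
eta = 1 - T_c / T_h

# Landauer limit: minimum energy to erase one bit at temperature T_c
E_bit = k_B * T_c * math.log(2)  # joules per bit

# Maximum bit rate for the full solar power budget
R_max = P_sun * eta / E_bit
print(f"eta   = {eta:.5f}")
print(f"E_bit = {E_bit:.3e} J/bit")
print(f"R_max = {R_max:.3e} bits/s")  # ~1.47e49
```

Running this reproduces the headline figure of roughly 1.47E+49 bits per second.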
Part 2
The theoretical computational limit of the solar system is 22 orders of magnitude above the estimated computational ability of all living humans. This is based on estimates of the number of synapses in the human brain, the update rate of those synapses, and the number of humans alive. This estimate is only an approximation and should be used with caution.
The purpose of this post was to show the limit of computation, and therefore intelligence, is far above all humans combined.
Relevant equations
Bit rate of all humans Rhumans=NsynRsynNhumans
Comparative rate Rc=Rmax/Rhumans
Relevant values
Number of synapses in the human brain [Nsyn] 2.50E+14
Synaptic update rate [Rsyn] (Hz) 500
Number of humans alive [Nhumans] 8.07E+09
Theoretical computational limit [Rmax] (bit s-1) 1.47E+49
Calculation
Bit rate of all humans Rhumans=NsynRsynNhumans
Rhumans=2.50E+14*500*8.07E+09
Rhumans=1.01E+27 bits per second
Comparative rate Rc=Rmax/Rhumans
Rc=1.47E+49/1.01E+27
Rc=1E+22
Notes
Numbers are shown rounded to 3 significant figures; full values were used in calculations. The final result is rounded to one significant figure due to low confidence in the synaptic update rate.
Synaptic update rate estimated based on a 2-millisecond refractory period of a neuron.
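The Part 2 comparison can be checked the same way (a sketch only; the synapse count and update rate are the rough estimates from the table above):

```python
import math

# Rough inputs from the table above
N_syn = 2.5e14      # synapses per human brain (rough estimate)
R_syn = 500         # synaptic update rate, Hz (from ~2 ms refractory period)
N_humans = 8.07e9   # humans alive
R_max = 1.47e49     # theoretical computational limit from Part 1, bits/s

# Aggregate bit rate of all human brains
R_humans = N_syn * R_syn * N_humans

# Orders of magnitude by which the solar-system limit exceeds humanity
R_c = R_max / R_humans
print(f"R_humans = {R_humans:.3e} bits/s")       # ~1.01e27
print(f"R_c ≈ 1e{round(math.log10(R_c))}")       # ~22 orders of magnitude
```

This reproduces both the ~1.01E+27 bits per second figure and the 22-orders-of-magnitude gap.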
If you’re trying to maximize computational efficiency, instead of building a Dyson sphere, shouldn’t you drop the sun into a black hole and harvest the Hawking radiation?
Hi Robi,
For reference, Anders Sandberg discussed that on The 80,000 Hours Podcast (emphasis mine):
William, I am guessing you would like Anders’ episodes! You can find them searching for “Anders Sandberg” here.
Yeah, I found him to be a fascinating person when I talked to him at EAGx Warsaw.
I’m initially sceptical of getting 40% of the mass-energy out of, well, anything. Perhaps I would benefit from reading more on black holes.
However, I would in principle agree that if black holes are feasible power sources, this would raise the theoretical maximum computation rate.
Hi William, interesting post :) Some reactions:
I think the LW audience may be interested in this as well, so consider cross-posting there
While I don’t think this will change your (pretty unobjectionable) bottomline conclusion that “the limit of computation is far above all humans combined”, I’d be curious to know if you’ve considered other approaches to estimating the theoretical computational limit of the solar system, and how those BOTECs would compare to your current approach (in the spirit of how Ajeya Cotra considered 4 different anchors in her report for estimating training computation requirements for a “transformative” model, and arrived at a pretty wide range)
Same question for estimating the computational ability of all humans alive. In particular you may want to check out Open Philanthropy’s 2020 report How Much Computational Power Does It Take to Match the Human Brain? (see chart below from the report for how widely the numbers can range)
Come to think of it, if the idea is to show that the “limit of computation is far above all humans combined”, you may be interested in a computation efficiency-oriented perspective (e.g. normalizing by power consumption), in which case Robert Freitas’ sentience quotient may interest you (scroll down to the last section). Seth Lloyd’s ultimate laptop calculations may interest you as well
Considering instead the perspective of “ultimate brains”, you may be interested in Anders Sandberg’s The physics of information processing superobjects: daily life among the Jupiter brains, in particular his calculations w.r.t. the physics and engineering constraints guiding the design specs for Dyson brains, Jupiter brains and “neutronium” brains
Happy reading :)
Hi Mo, thanks for the feedback.
Good thought, I’ve cross-posted it to my account there.
This post was spurred by a conversation I had about the upper limit of AI intelligence and the fact that it was likely very far above all humans combined. This is meant as, like you said, a pretty unobjectionable support for my then-assumed conclusion. The conversation was heavily influenced by Cotra's Bioanchors report.
I was estimating the brain's computational ability very roughly. I guessed that more detailed estimates had already been done, but that they would take time to read through and understand. I'll read through the document when I have some time.
These two look interesting to read.
Anders Sandberg is an interesting person. I suspected someone had done calculations similar to mine; I'm not surprised that he is one of them.
Yeah, your initial guess was right that more detailed estimates had already been done. I figured the reason you posted your rough BOTEC was to invite others to refer you to those more detailed estimates (it saves time on your end via crowdsourcing too); I've done the same in the past, hence the shares :) Happy reading
An efficient idea, good thinking.