Interesting post. I recently asked the question whether anyone had quantified the percent of tasks that computers are superhuman at as a function of time—has anyone?
A lot of the decline in global GDP growth rate is due to lower population growth rate. But it is true that the GDP per capita growth in “frontier” (developed) countries has been falling.
“Electric motors, pumps, battery charging, hydroelectric power, electricity transmission — among many other things — operate at near perfect efficiency (often around 90%).”
This is seriously cherry picked. Anders Sandberg is writing a book called Grand Futures, and he talks about where fundamental limits lie. In almost all cases, we are far away from fundamental limits. For instance, if it is 10°C colder outside than inside, the fundamental limit on the efficiency of a heat pump is about 3000% (a coefficient of performance of about 30). However, we are currently at around a coefficient of performance of 4. The conversion of solar energy to food is around 0.1% efficient for the typical field, and an order of magnitude lower than that for animal systems. Comminution (breaking up rock) is only about 1% efficient.
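(For reference, the heat-pump figure is just the Carnot limit for heating; taking 20°C inside and 10°C outside for concreteness, with temperatures in kelvin:

$$\mathrm{COP}_{\max} = \frac{T_{\text{hot}}}{T_{\text{hot}} - T_{\text{cold}}} = \frac{293\,\mathrm{K}}{10\,\mathrm{K}} \approx 29,$$

so a real-world COP of about 4 sits at roughly 14% of the thermodynamic ceiling.)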
This decline is arguably what one would expect given the predicted end of Moore’s law, projected to hit the ceiling around 2025. In particular, as we approach the end of Moore’s law, we should arguably expect progress to decline steadily as we near the ceiling, rather than expecting a sudden slowdown that only kicks in once we reach 2025 (or thereabouts).
Progress on other measures has also greatly slowed down, such as processor clock speeds, the cost of hard drive storage (cf. Kryder’s law), and computations per joule (cf. Koomey’s law).
Irreversible computing could be about five orders of magnitude more efficient than what we do now, and reversible computing could be many orders of magnitude more efficient than that.
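(The irreversible-computing figure comes from comparing today’s switching energies against the Landauer limit, the minimum energy needed to erase one bit. A quick sanity check; note that the ~10⁻¹⁶ J per bit operation figure for current hardware is my own ballpark assumption, not a number from the comment:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)
T = 300.0           # room temperature, K

# Landauer limit: minimum energy to irreversibly erase one bit
landauer_j = k_B * T * math.log(2)  # ~2.9e-21 J

# Assumed ballpark energy per bit operation in current CMOS
# (illustrative figure, not from the original comment)
cmos_j = 1e-16

# How many orders of magnitude of headroom remain
gap_orders = math.log10(cmos_j / landauer_j)

print(f"Landauer limit at 300 K: {landauer_j:.2e} J per bit")
print(f"Headroom: ~{gap_orders:.1f} orders of magnitude")
```

Under that assumption the gap works out to roughly 4–5 orders of magnitude, consistent with the "about five" claim; reversible computing is not bound by the Landauer limit at all, which is why its headroom is larger still.)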
I recently asked the question whether anyone had quantified the percent of tasks that computers are superhuman at as a function of time—has anyone?
I’m not aware of any. Though I suppose it would depend a lot on how such a measure is operationalized (in terms of which tasks are included).
This is seriously cherry picked.
I quoted that line of Murphy’s as one that provides examples of key technologies that are close to hitting ultimate limits; I didn’t mean to say that they were representative of all technologies. :)
But it’s worth noting that more examples of technologies that cannot be improved that much further are provided in Gordon’s The Rise and Fall of American Growth as well as in Murphy’s article. Another example is the speed at which we communicate information around Earth, where we hit the limit (the speed of light) many decades ago. That’s a pretty significant step that cannot be repeated, which of course isn’t to say that there isn’t still much room for progress in many other respects.
On the second point, I was referring to empirical trends we’re observing, as well as a theoretical ceiling within the hardware paradigm that has been dominant for the last several decades. It is quite conceivable — though hardly guaranteed — that other paradigms will eventually displace the silicon paradigm as we know it, which gives us some reason to believe that we might see even faster growth in the future. But I don’t think it’s a strong reason in light of the recent empirical trends and the fact that the ultimate limits to computation are not that far off.