Interesting post. I recently asked the question whether anyone had quantified the percent of tasks that computers are superhuman at as a function of time. Has anyone?
A lot of the decline in the global GDP growth rate is due to a lower population growth rate. But it is true that GDP per capita growth in "frontier" (developed) countries has been falling.
"Electric motors, pumps, battery charging, hydroelectric power, electricity transmission, among many other things, operate at near perfect efficiency (often around 90%)."
This is seriously cherry-picked. Anders Sandberg is writing a book called Grand Futures in which he discusses where the fundamental limits lie. In almost all cases, we are far away from those limits. For instance, if it is 10°C colder outside than inside, the fundamental limit on the efficiency of a heat pump is about 3000% (a coefficient of performance of 30), yet we are currently at a coefficient of performance of around four. The conversion of solar energy to food is around 0.1% efficient for a typical field, and an order of magnitude lower than that for animal systems. Comminution (breaking up rock) is only about 1% efficient.
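The Carnot bound behind that heat-pump figure is easy to check with a few lines of Python (a minimal sketch; the 20°C indoor temperature is an assumption, since the comment only fixes the 10°C difference):

```python
# Carnot limit on heating COP: COP_max = T_hot / (T_hot - T_cold),
# with temperatures in kelvin.
def carnot_heating_cop(t_inside_c: float, t_outside_c: float) -> float:
    t_hot = t_inside_c + 273.15
    t_cold = t_outside_c + 273.15
    return t_hot / (t_hot - t_cold)

# Assumed example: inside at 20 °C, outside 10 °C colder.
limit = carnot_heating_cop(20.0, 10.0)
print(round(limit, 1))  # 29.3, i.e. roughly a COP of 30 (3000%)
```

Note that the bound depends strongly on the temperature difference: the colder it is outside, the lower the ceiling.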
This decline is arguably what one would expect given the predicted end of Moore's law, projected to hit its ceiling around 2025. In particular, in approaching the end of Moore's law, we should expect progress to steadily decline as we get closer to the ceiling, rather than expecting a sudden slowdown that only kicks in once we reach 2025 (or thereabouts).
Progress on other measures has also greatly slowed down, such as processor clock speeds, the cost of hard drive storage (cf. Kryder's law), and computations per joule (cf. Koomey's law).
Irreversible computing could be about five orders of magnitude more efficient than what we do now, and reversible computing could be many orders of magnitude more efficient than that.
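The "five orders of magnitude" figure presumably traces back to the Landauer bound, the thermodynamic minimum energy for irreversibly erasing one bit. A rough back-of-the-envelope sketch (the ~300 K operating temperature and the implied current per-bit energy are illustrative assumptions, not figures from the comment):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def landauer_limit_joules(temp_kelvin: float = 300.0) -> float:
    """Minimum energy to erase one bit: k_B * T * ln(2)."""
    return K_B * temp_kelvin * math.log(2)

e_min = landauer_limit_joules()
print(f"{e_min:.2e} J per bit erased")  # 2.87e-21 J at 300 K

# If current hardware sits ~5 orders of magnitude above this bound
# (the figure in the comment), that would imply roughly:
print(f"{e_min * 1e5:.2e} J per bit")  # 2.87e-16 J
```

Reversible computing escapes this bound entirely in principle, which is why its headroom is so much larger.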
I recently asked the question whether anyone had quantified the percent of tasks that computers are superhuman at as a function of time. Has anyone?
I'm not aware of any. Though I suppose it would depend a lot on how such a measure is operationalized (in terms of which tasks are included).
This is seriously cherry-picked.
I quoted that line of Murphy's as one that provides examples of key technologies that are close to hitting ultimate limits; I didn't mean to say that they were representative of all technologies. :)
But it's worth noting that more examples of technologies that cannot be improved much further are provided in Gordon's The Rise and Fall of American Growth as well as in Murphy's article. Another limit we hit many decades ago is the speed at which we can communicate information around Earth, which is bounded by the speed of light. That's a pretty significant step that cannot be repeated, which of course isn't to say that there isn't still much room for progress in many other respects.
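For concreteness, the hard floor that this limit implies can be computed directly. A small sketch (using the equatorial circumference; real links through fiber are slower, since light in glass travels at roughly two-thirds of c, and routes are longer than great circles):

```python
C = 299_792_458                      # speed of light in vacuum, m/s
EARTH_CIRCUMFERENCE_M = 40_075_000   # equatorial circumference, m

# One-way trip halfway around Earth at the speed of light:
min_latency_s = (EARTH_CIRCUMFERENCE_M / 2) / C
print(f"{min_latency_s * 1000:.0f} ms")  # 67 ms
```

No future technology can push antipodal one-way latency below that figure, which is what makes this a limit that has genuinely been reached rather than one with remaining headroom.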
On the second point, I was referring to empirical trends we're observing, as well as a theoretical ceiling within the hardware paradigm that has been dominant for the last several decades. It is quite conceivable, though hardly guaranteed, that other paradigms will eventually outperform the silicon paradigm as we know it, which gives us some reason to believe that we might see even faster growth in the future. But I don't think it's a strong reason, in light of the recent empirical trends and the fact that the ultimate limits to computation are not that far off.