Very much enjoyed the post.
The thesis that recent (50-year) declines in innovation productivity are best explained by innovation generally getting structurally harder over time does, I think, somewhat overfit the data.
Sketched argument below:
Innovation is cumulative. And in particular new tools create new possibilities for innovation as much as the reverse. So no astronomy without the telescope, no modern medicine without organic chemistry, no Beethoven without the invention of the piano, no early mathematics without Hindu-Arabic numerals, etc.
When the right tool arrives, there is an explosion of innovation, followed by a slowdown.
There is a degree of randomness in these bursts, and the 80 years around the turn of the 19th/20th century was a particularly strong cluster (from the publication of Maxwell’s equations in 1865 to the Trinity nuclear test in 1945). Humanity went from candles and horses to nuclear power, jet engines, the control of most communicable diseases, electrification, relativity and quantum mechanics, the telephone, early computers, and many others. Art and culture also shifted abruptly and in very interesting ways.
Note that this was an acceleration relative to the earlier 19th century; innovation doesn’t always get harder.
If the limiting factor is the right tool, rather than people or money, then huge investment in research will produce drops in measured productivity for fundamental breakthroughs: more researchers and more money end up chasing a roughly fixed pool of tool-enabled discoveries. And the people we call geniuses are just those who get their hands on the tool first (a bit like Bill Gates being one of a handful of people globally able to play with computers in their teens).
The post-1970 (?) slowdown in innovation is therefore to some extent an artefact of comparison with an exceptional cluster, and may in itself just be a relative trough.
The big question, it seems to me, is whether AI and tools like CRISPR are the sorts of fundamental tools that can spark a new acceleration.