Ah, I want to acknowledge that the definition of civilization is quite broad, without getting too far into the weeds on this point.
I heard the economist Steve Keen describe civilization as ‘harnessing energy to elevate us above the base level of the planet’ (I may be paraphrasing somewhat).
I think this is a pretty good definition, because it also makes it clear why civilization is inherently unstable, and thus fragile: it is, by definition, out of equilibrium with the natural environment.
And any ecologist will know what happens next in this situation—overshoot[1].
So all civilization is inherently fragile, and the larger it grows, the more it depletes the carrying capacity of the environment.
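To make the overshoot dynamic a bit more concrete, here is a minimal toy sketch (my own illustration; the function and all parameter values are assumptions, not taken from any of the sources mentioned here). A population grows by harvesting a regenerating resource stock; once harvesting outpaces regeneration, the stock is drawn down and the population that depended on it contracts.

```python
# Toy overshoot model (illustrative only; all parameter values are assumptions).
def simulate(steps=500):
    r, K = 0.01, 1000.0        # resource regrowth rate and undisturbed stock level
    harvest = 0.001            # per-capita harvesting pressure on the stock
    birth, death = 0.03, 0.02  # births scale with total harvest; deaths are constant
    resource, pop = K, 10.0
    trajectory = []
    for t in range(steps):
        consumed = harvest * pop * resource           # what the population extracts this step
        regrowth = r * resource * (1 - resource / K)  # logistic regeneration of the stock
        resource = max(resource + regrowth - consumed, 0.0)
        pop = max(pop + birth * consumed - death * pop, 0.0)  # growth funded by consumption
        trajectory.append((t, pop, resource))
    return trajectory

if __name__ == "__main__":
    for t, pop, resource in simulate()[::100]:
        print(f"t={t:3d}  population={pop:8.2f}  resource={resource:8.2f}")
```

The numbers are arbitrary; the structural point is that the peak the population reaches while drawing the stock down is not a level the regenerating stock can actually support.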
Which brings us to industrial/post-industrial civilization:
I think the best metaphor for industrial civilization is a rocket—it’s an incredibly powerful channeled explosion that has the potential to take you to space, but also has the potential to explode, and has a finite quantity of fuel.
The ‘fuel’, in the case of industrial civilization, is not simply material resources such as oil and coal, but also environmental resources: the complex ecologies that support life on the planet, and even the stable, temperate climate that gave us the opportunity to settle down and form civilization.
Civilization can only form during the tiny warm peaks of the glacial cycle, the interglacial periods. Anthropogenic climate change pushes the climate far beyond the bounds of this cycle, and there is no guarantee that it will return to a cadence capable of supporting future civilizations.
Further, our current level of development was the result of a complex chain of geopolitical events that produced a prolonged period of global stability and prosperity.
While it may be possible for future civilizations to achieve some level of technological development, it is incredibly unlikely they will ever have the resources and conditions that enabled us to reach the ‘digital’ tech level.
Consider that even now, under far better conditions than we can expect future civilizations to have, it is still more likely that we’ll destroy ourselves than flourish. That potential for self-destruction is unabated in future civilizations, whereas the potential for flourishing is heavily, if not completely, depleted.

[1] https://biologydictionary.net/carrying-capacity/
Replying to myself with an additional contribution I just read, from Gail Tverberg, that says everything much better than I managed:
In physics terms, the world economy, as well as all of the individual economies within it, are dissipative structures. As such, growth followed by collapse is a usual pattern. At the same time, new versions of dissipative structures can be expected to form, some of which may be better adapted to changing conditions. Thus, approaches for economic growth that seem impossible today may be possible over a longer timeframe.
For example, if climate change opens up access to more coal supplies in very cold areas, the Maximum Power Principle would suggest that some economy will eventually access such deposits. Thus, while we seem to be reaching an end now, over the long-term, self-organizing systems can be expected to find ways to utilize (“dissipate”) any energy supply that can be inexpensively accessed, considering both complexity and direct fuel use.
I would add that while new structures can be expected to form, because they will be adapted to different conditions and will exploit different energy gradients, we should not expect them to have the same features or levels of complexity.
This is highly relevant to your interest in scaling trust:
https://www.lesswrong.com/posts/Fu7bqAyCMjfcMzBah/eigenkarma-trust-at-scale
Yeah :) I’m actually already trying to contribute to that project. Thanks for thinking of me when you saw something relevant though.