Leopold has now published a popular article discussing this topic. Highly recommended.
An excerpt:
Philosophers like Nick Bostrom, Derek Parfit, and Toby Ord have become increasingly concerned about such so-called “existential risks.” An unrecoverable collapse of civilization wouldn’t just be tragic for the billions who would suffer and die. Perhaps the greatest tragedy would be the foreclosing of all of humanity’s potential. Humanity could flourish for billions of years and enable trillions of happy human lives—if only we do not destroy ourselves beforehand.
This line of thinking has led some to question whether “progress”—in particular, technological progress—is as straightforwardly beneficial as commonly assumed. Nick Bostrom imagines the process of technological development as “pulling balls out of a giant urn.” So far, we’ve been lucky, pulling out a great many “white” balls that are broadly beneficial. But someday, we might pull out a “black” ball: a new technology that destroys humanity. Before the first nuclear test, some physicists worried that the nuclear bomb would ignite the atmosphere and end the world. Their calculations ultimately deemed it “extremely unlikely,” and so they proceeded with the test—which, as it turns out, did not end the world. Perhaps next time, we won’t be so lucky.
The same technological progress that creates these risks is also what drives economic growth. Does that mean economic growth is inherently risky? Economic growth has brought about extraordinary prosperity. But for the sake of posterity, must we choose safe stagnation instead? This view is arguably becoming ever more popular, particularly amongst those concerned about climate change; Greta Thunberg recently denounced “fairy tales of eternal economic growth” at the United Nations.
I argue that the opposite is the case. It is not safe stagnation and risky growth that we must choose between; rather, it is stagnation that is risky and it is growth that leads to safety.
We might indeed be in a “time of perils”: we might be advanced enough to have developed the means for our destruction, but not advanced enough to care sufficiently about safety. But stagnation does not solve the problem: we would simply stagnate at this high level of risk. Eventually, a nuclear war or environmental catastrophe would doom humanity regardless.
Faster economic growth could initially increase risk, as feared. But it will also help us get past this time of perils more quickly. When people are poor, they can’t focus on much beyond ensuring their own livelihoods. But as people grow richer, they start caring more about things like the environment and protecting against risks to life. And so, as economic growth makes people richer, they will invest more in safety, protecting against existential catastrophes. Just as technological innovation and our growing wealth have allowed us to conquer past threats to human life like smallpox, so can faster economic growth, in the long run, increase the overall chances of humanity’s survival.