Astronomical waste is the loss of potential value from delaying the efficient exploitation of the universe’s resources. Both the term and the concept it expresses were introduced by Nick Bostrom in a seminal 2003 paper.[1]
The accessible universe is vast, and virtually all of it remains unexploited. The Virgo Supercluster contains $10^{13}$ stars, and the energy of each star could power $10^{42}$ computations per second. The human brain can perform about $10^{17}$ computations per second. Assuming that the morally relevant properties of the brain, such as phenomenal consciousness, supervene on its functional organization, it follows that the universe could support, every second, an amount of value equivalent to that realized in $10^{38}$ human lives. The moral costs of failing to actualize this potential thus appear to be enormous.
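Combining these figures gives a back-of-the-envelope estimate in the spirit of Bostrom’s paper:

$$\frac{10^{13}\ \text{stars} \times 10^{42}\ \text{computations/s per star}}{10^{17}\ \text{computations/s per human life}} = 10^{38}\ \text{human lives per second}$$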
In relative terms, however, the costs may be modest. The cosmos has existed for about 10 billion years, so one should not antecedently expect cosmological processes to cause value to decay by more than roughly 1 part in 10 billion per year. The observational evidence appears to be consistent with this prior assessment: the universe’s finite lifespan, its expansion, and the burning down of its stars all seem to be proceeding slowly enough to be in line with the estimate based on the duration of the universe so far.[2]
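In symbols, and assuming a uniform loss rate, a decade of delay would then cost on the order of

$$10^{-10}\ \text{yr}^{-1} \times 10\ \text{yr} = 10^{-9}$$

of total attainable value, which is the figure used in the comparison that follows.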
If the opportunity costs of delaying the exploitation of the universe’s resources are so small in relative terms, however large they may be in absolute terms, such costs appear unimportant relative to the costs arising from exposure to existential risk. Over the next decade, perhaps a billionth of total attainable value will be lost due to failure to arrange the universe optimally. Over that same decade, perhaps a thousandth of this value will be lost in expectation from exposure to a 0.1 percent risk of an existential catastrophe. The costs from existential risk exposure thus appear to exceed the opportunity costs of delayed expansion by about six orders of magnitude.
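As a rough sketch of this comparison, using the decadal figures above:

$$\frac{\text{expected loss from existential risk}}{\text{loss from delayed expansion}} \approx \frac{10^{-3}}{10^{-9}} = 10^{6}$$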
Thus, although an altruist who first notices the astronomical costs of delayed technological development may be tempted to conclude that such development should be hastened, that conclusion does not survive careful reflection. Because anthropogenic risks from new technologies constitute the bulk of existential risk, accelerating the development of new technology will itself have significant effects on existential risk. These effects dwarf any gains from reducing astronomical waste and should therefore be the primary consideration in decision-making.
Terminology
Astronomical waste is often cited as a consideration in favor of longtermism. When authors invoke “astronomical waste” in these contexts, however, what they typically mean is not the costs of delayed expansion but the costs of failed (or flawed) expansion. Thus, Carl Shulman mentions “the expected Astronomical Waste if humanity were rendered extinct by a sudden asteroid impact.”[3] Similarly, linking to Bostrom’s paper, Gwern Branwen writes that “human extinction represents the loss of literally astronomical amounts of utility.”[4] And Siebe Rozendal writes that “Extinction would be an ‘astronomical waste’.”[5][6] The expression astronomical stakes[7][8] may be used to express this latter sense, reserving astronomical waste for the opportunity costs of delayed technological development.
Further reading
Bostrom, Nick (2003) Astronomical waste: the opportunity cost of delayed technological development, Utilitas, vol. 15, pp. 308–314.
Christiano, Paul (2013) Astronomical waste, Rational Altruist, April 30.
Related entries
differential progress | ethics of existential risk | space colonization | speeding up development
- ^
Bostrom, Nick (2003) Astronomical waste: the opportunity cost of delayed technological development, Utilitas, vol. 15, pp. 308–314.
- ^
Christiano, Paul (2013) Astronomical waste, Rational Altruist, April 30.
- ^
Shulman, Carl (2012) Are pain and pleasure equally energy-efficient?, Reflective Disequilibrium, March 24.
- ^
Branwen, Gwern (2020) Optimal existential risk reduction investment, Gwern.net, May 28.
- ^
Rozendal, Siebe (2019) Eight high-level uncertainties about global catastrophic and existential risk, Effective Altruism Forum, November 28.
- ^
See also Wei Dai (2014) Is the potential astronomical waste in our universe too small to care about?, LessWrong, October 21, Gregory Lewis (2018) The person-affecting value of existential risk reduction, Effective Altruism Forum, April 13, and David Kristoffersson (2020) The ‘far future’ is not just the far future, Effective Altruism Forum, January 16.
- ^
Bostrom, Nick (2015) Astronomical stakes, Effective Altruism Global, November 25.
- ^
Wiblin, Robert (2016) Making sense of long-term indirect effects, Effective Altruism, August 7.