Astronomical waste

Last edit: 12 Jun 2021 10:45 UTC by EA Wiki assistant

Astronomical waste is the loss of potential value resulting from delaying the efficient exploitation of the universe’s resources. The term and the concept expressed by it were introduced by Nick Bostrom in a seminal paper (Bostrom 2003).

The accessible universe is vast, and virtually all of it remains unexploited. The Virgo Supercluster contains some 10^13 stars, and the energy output of each star could power around 10^42 computations per second. The human brain can perform about 10^17 computations per second. Assuming that the morally relevant properties of the brain, such as phenomenal consciousness, supervene on its functional organization, it follows that the universe could support, every second, an amount of value equivalent to that realized in 10^38 human lives. The moral costs of failing to actualize this potential thus appear to be enormous.
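This back-of-the-envelope estimate can be reproduced directly. The figures below (10^13 stars in the Virgo Supercluster, 10^42 operations per second per star, 10^17 operations per second per human brain) are the ones Bostrom (2003) uses; they are rough order-of-magnitude assumptions, not precise measurements:

```python
# Rough reproduction of Bostrom's (2003) estimate of the computational
# capacity of the Virgo Supercluster, in "human brain-equivalents" per second.

stars_in_virgo_supercluster = 1e13  # Bostrom's figure for the supercluster
ops_per_second_per_star = 1e42      # computations one star's energy could power
ops_per_second_per_brain = 1e17     # rough estimate for a human brain

total_ops_per_second = stars_in_virgo_supercluster * ops_per_second_per_star
brain_equivalents = total_ops_per_second / ops_per_second_per_brain

print(f"{brain_equivalents:.0e} human-life-equivalents per second")  # on the order of 1e38
```

So even a one-second delay in exploiting these resources forgoes value on the order of 10^38 human lives, which is the sense in which the waste is "astronomical".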

In relative terms, however, the costs may be quite modest. The cosmos has existed for about 10 billion years, so one should not antecedently expect cosmological processes to cause value to decay by more than about 1 part in 10 billion per year. The observational evidence appears to be consistent with this prior assessment: the finitude, expansion, and burning down of the universe all seem to be occurring slowly enough to be in line with the estimate based on the universe's age so far (Christiano 2013).

If the opportunity costs of delaying the exploitation of the universe's resources are so low in relative terms, however large they may be in absolute terms, they are unimportant compared with the costs arising from exposure to existential risk. Over the next decade, perhaps a billionth of total attainable value will be lost as a result of failing to arrange the universe optimally. Over that same decade, perhaps a thousandth of this value will be lost in expectation from exposure to a 0.1 percent risk of an existential catastrophe. The costs from existential risk thus appear to exceed the opportunity costs by some six orders of magnitude.
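The comparison can be made explicit with the paragraph's own stipulated figures (a one-billionth loss per decade from delay, and a 0.1 percent decade-level existential risk that would forfeit essentially all attainable value):

```python
# Comparing a decade's opportunity cost of delay against the expected loss
# from existential risk, using the article's stipulated illustrative figures.

delay_loss_fraction = 1e-9  # ~1 billionth of attainable value lost per decade of delay
xrisk_probability = 1e-3    # 0.1% chance of existential catastrophe this decade
xrisk_loss_fraction = xrisk_probability * 1.0  # catastrophe forfeits all value

ratio = xrisk_loss_fraction / delay_loss_fraction
print(f"x-risk costs exceed delay costs by a factor of about {ratio:.0e}")
```

On these assumptions the expected cost of existential risk is about a million times the cost of delay, which is why reducing risk, rather than hastening expansion, dominates the calculation.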

Thus, although an altruist who first notices the astronomical costs of delayed technological development may be tempted to conclude that such development should be hastened, that conclusion does not survive careful reflection. Because the bulk of existential risk is anthropogenic, arising from new technologies, accelerating technological development will itself have major effects on existential risk. Such effects dwarf any gains from reducing astronomical waste, and should therefore be the primary consideration in decision-making.

A note about terminology

Astronomical waste is often cited as a consideration in favor of longtermism. When authors talk about “astronomical waste” in these contexts, however, what they typically mean by that phrase is not the cost of delayed expansion but the cost of failed (or flawed) expansion. Thus, Carl Shulman mentions “the expected Astronomical Waste if humanity were rendered extinct by a sudden asteroid impact” (Shulman 2012). Similarly, linking to Bostrom’s paper, Gwern Branwen writes that “human extinction represents the loss of literally astronomical amounts of utility” (Branwen 2020). And Siebe Rozendal writes that “Extinction would be an ‘astronomical waste’” (Rozendal 2019; for further examples, see Kristoffersson 2020 and Dai 2014). The expression astronomical stakes (cf. Wiblin 2016) may be used to express this idea, reserving astronomical waste for the concept in Bostrom’s original paper.


Bostrom, Nick (2003) Astronomical waste: the opportunity cost of delayed technological development, Utilitas, vol. 15, pp. 308–314.

Branwen, Gwern (2020) Optimal existential risk reduction investment, May 28.

Christiano, Paul (2013) Astronomical waste, Rational Altruist, April 30.

Dai, Wei (2014) Is the potential astronomical waste in our universe too small to care about?, LessWrong, October 21.

Kristoffersson, David (2020) The ‘far future’ is not just the far future, Effective Altruism Forum, January 16.

Rozendal, Siebe (2019) Eight high-level uncertainties about global catastrophic and existential risk, Effective Altruism Forum, November 28.

Shulman, Carl (2012) Are pain and pleasure equally energy-efficient?, Reflective Disequilibrium, March 24.

Wiblin, Robert (2016) Making sense of long-term indirect effects, Effective Altruism, August 7.

Related entries

differential progress | ethics of existential risk | space colonization | speeding up development
