The long-term future focuses on possible ways in which the future of humanity may unfold over long timescales.
Bostrom’s typology of possible scenarios
Nick Bostrom has identified four broad possibilities for the future of humanity.[1]
First, humans may go prematurely extinct. Since the universe will eventually become inhospitable, extinction is inevitable in the very long run. However, it is also plausible that humanity will die out long before this deadline.
Second, human civilization may plateau, reaching a level of technological advancement beyond which no further progress is feasible.
Third, human civilization may experience recurrent collapse, undergoing repeated declines or catastrophes that prevent it from moving beyond a certain level of advancement.
Fourth, human civilization may advance so significantly as to become nearly unrecognizable. Bostrom conceptualizes this scenario as a “posthuman” era where people have developed significantly different cognitive abilities, population sizes, body types, sensory or emotional experiences, or life expectancies.
Further reading
Baum, Seth D. et al. (2019) Long-term trajectories of human civilization, Foresight, vol. 21, pp. 53–83.
Bostrom, Nick (2009) The future of humanity, in Jan Kyrre Berg Olsen, Evan Selinger & Søren Riis (eds.) New Waves in Philosophy of Technology, London: Palgrave Macmillan, pp. 186–215.
Hanson, Robin (1998) Long-term growth as a sequence of exponential modes, working paper, George Mason University (updated December 2000).
Roodman, David (2020) Modeling the human trajectory, Open Philanthropy, June 15.
Related entries
longtermism | non-humans and the long-term future | space colonization
[1] Bostrom, Nick (2009) The future of humanity, in Jan Kyrre Berg Olsen, Evan Selinger & Søren Riis (eds.) New Waves in Philosophy of Technology, London: Palgrave Macmillan, pp. 186–215.