An intelligence explosion (sometimes called a technological singularity, or singularity for short) is a hypothesized event in which a sufficiently advanced artificial intelligence rapidly attains superhuman intellectual ability through a process of recursive self-improvement: each gain in the system's capability increases its ability to make further gains, so that improvement accelerates rather than levelling off.
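One way to see why such a feedback loop might produce explosive rather than merely steady growth is a toy growth model of the kind surveyed in Sandberg (2013). The specific equation below is purely illustrative, an assumption made here for exposition, and is not drawn from any of the works listed under further reading. Suppose a system's capability I(t) improves at a rate that itself increases with its current capability:

\[
  \frac{dI}{dt} = k\, I^{\alpha}, \qquad k > 0 .
\]

For α > 1, separating variables gives

\[
  I(t) = \left( I_0^{\,1-\alpha} - (\alpha - 1)\, k\, t \right)^{\frac{1}{1-\alpha}},
\]

which diverges at the finite time t* = I₀^(1-α) / ((α-1) k), a singularity in the literal mathematical sense. For α ≤ 1 the same feedback yields only exponential or slower growth, and whether real AI development is better described by the first regime or the second is one way of framing the disagreement between proponents and critics of the intelligence explosion hypothesis.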
Further reading
Bostrom, Nick (2014) Superintelligence: Paths, Dangers, Strategies, Oxford: Oxford University Press.
Chalmers, David J. (2010) The singularity: A philosophical analysis, Journal of Consciousness Studies, vol. 17, pp. 7–65.
Pearce, David (2012) The biointelligence explosion: how recursively self-improving organic robots will modify their own source code and bootstrap our way to full-spectrum superintelligence, in Amnon H. Eden et al. (eds.) Singularity Hypotheses: A Scientific and Philosophical Assessment, Berlin: Springer, pp. 199–238.
Sandberg, Anders (2013) An overview of models of technological singularity, in Max More & Natasha Vita-More (eds.) The Transhumanist Reader: Classical and Contemporary Essays on the Science, Technology, and Philosophy of the Human Future, Malden: Wiley, pp. 376–394.
Vinding, Magnus (2017) A contra AI FOOM reading list, Magnus Vinding’s Blog, December (updated June 2022).
Related entries
AI skepticism | AI takeoff | artificial intelligence | flourishing futures | superintelligence | transformative artificial intelligence