The 25 researchers who have published the largest number of academic articles on existential risk


I created a list of researchers in the existential risk field based on the number of papers they have published. In the following I provide links to their work. This is not meant as a strict evaluation of the value of their contributions to the field, but rather as a quick overview of who is working on what. I hope it is helpful for people who are interested in existential risk studies, whether professionally or personally, to understand the main topics that are currently being worked on. I think it could be especially helpful for people who are new to existential risk research and want to understand what the established organizations, researchers and topics are.

This is based on The Existential Risk Research Assessment (TERRA). TERRA is a website that uses machine learning to find publications related to existential risk. It is run by the Centre for the Study of Existential Risk (CSER) and was originally launched by Gorm Shackelford. I used their curated list of papers and wrote some code that counted the number of papers per author. You can find the code here, and here are the results:

I am sorry for butchering some names; this is due to the way I had to strip the strings to make them easy to count. As TERRA is based on the manual assessment of automatically collected papers, this list is likely incomplete, but I still think it gives a good overview of what is going on in existential risk studies. If you want to improve the data, feel free to make an account at TERRA and start assessing papers.
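
For reference, here is a minimal sketch of the kind of counting script I mean (the actual code is in the linked repository). The file name, the "authors" column and the ";" separator are illustrative assumptions, not the real data format:

```python
from collections import Counter
import csv
import unicodedata


def normalize(name: str) -> str:
    """Strip accents and punctuation so the same author is counted under one key.
    This is the step that can butcher names (e.g. 'Ćirković' becomes 'cirkovic')."""
    ascii_name = unicodedata.normalize("NFKD", name).encode("ascii", "ignore").decode()
    return ascii_name.replace(".", "").replace(",", "").strip().lower()


def count_papers_per_author(path: str) -> Counter:
    """Count how many assessed papers each author appears on."""
    counts = Counter()
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            # assumes one row per paper, with authors separated by ';'
            for author in row["authors"].split(";"):
                counts[normalize(author)] += 1
    return counts


if __name__ == "__main__":
    # print the 25 most frequent authors
    for author, n in count_papers_per_author("terra_papers.csv").most_common(25):
        print(f"{n:4d}  {author}")
```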

In the following I have curated a list of the top 25 researchers with links to their Google Scholar profiles (where I could find them), the main existential risk organization they are affiliated with, and a publication that I think showcases the kind of existential risk research they do. If you have the impression that some of the people in the list would be better represented by another publication, please let me know and I will change it. 25 is an arbitrary cutoff; it does not mean that the person in 26th place is any worse than the person in 25th, but I had to stop somewhere. You can find the complete list in the repository.

Here is the list with the links:

  1. Seth Baum

    1. Global Catastrophic Risk Institute (GCRI)

    2. How long until human-level AI? Results from an expert assessment

  2. David Denkenberger

    1. Alliance to Feed the Earth in Disasters (ALLFED)

    2. Feeding everyone: Solving the food crisis in event of global catastrophes that kill crops or obscure the sun

  3. Joshua M. Pearce

    1. Alliance to Feed the Earth in Disasters (ALLFED)

    2. Leveraging Intellectual Property to Prevent Nuclear War

  4. Nick Bostrom

    1. Future of Humanity Institute (FHI)

    2. Superintelligence: Paths, Dangers, Strategies

  5. Roman V. Yampolskiy

    1. University of Louisville

    2. Predicting future AI failures from historic examples

  6. Émile P. Torres

    1. Currently no affiliation, former Centre for the Study of Existential Risk (CSER)

    2. Who would destroy the world? Omnicidal agents and related phenomena

  7. Milan M. Ćirković

    1. Astronomical Observatory of Belgrade

    2. The Temporal Aspect of the Drake Equation and SETI

  8. Bruce Edward Tonn

    1. University of Tennessee

    2. Obligations to future generations and acceptable risks of human extinction

  9. Alan Robock

    1. Rutgers University

    2. Volcanic eruptions and climate

  10. Owen Toon

    1. University of Colorado, Boulder

    2. Environmental perturbations caused by the impacts of asteroids and comets

  11. Jacob Haqq-Misra

    1. Blue Marble Space Institute of Science

    2. The Sustainability Solution to the Fermi Paradox

  12. Luke Kemp

    1. Centre for the Study of Existential Risk (CSER)

    2. Climate Endgame: Exploring catastrophic climate change scenarios

  13. Anders Sandberg

    1. Future of Humanity Institute (FHI)

    2. Converging Cognitive Enhancements

  14. Alexey Turchin

    1. Alliance to Feed the Earth in Disasters (ALLFED)

    2. Classification of global catastrophic risks connected with artificial intelligence

  15. Charles Bardeen

    1. National Center for Atmospheric Research

    2. Extreme Ozone Loss Following Nuclear War Results in Enhanced Surface Ultraviolet Radiation

  16. John Leslie

    1. University of Guelph

    2. Testing the Doomsday Argument

  17. Graciela Chichilnisky

    1. Columbia University

    2. The foundations of probability with black swans

  18. Stuart Armstrong

    1. Future of Humanity Institute (FHI)

    2. Eternity in six hours: Intergalactic spreading of intelligent life and sharpening the Fermi paradox

  19. Paul R. Ehrlich

    1. Stanford University

    2. Extinction: The Causes and Consequences of the Disappearance of Species

  20. Hin-Yan Liu

    1. University of Copenhagen

    2. Categorization and legality of autonomous and remote weapons systems

  21. Juan B. García Martínez

    1. Alliance to Feed the Earth in Disasters (ALLFED)

    2. Potential of microbial protein from hydrogen for preventing mass starvation in catastrophic scenarios

  22. David Morrison

    1. Ames Research Center

    2. Asteroid and comet impacts: the ultimate environmental catastrophe

  23. R. Grieve

    1. University of Western Ontario

    2. Extraterrestrial impacts on earth: the evidence and the consequences

  24. Richard S.J. Tol

    1. University of Sussex

    2. Why Worry About Climate Change? A Research Agenda

  25. Olle Häggström

    1. Chalmers University of Technology

    2. Artificial General Intelligence and the Common Sense Argument

To also get an overview of the overall output of existential risk organizations, I added up the publications of all researchers at the same organization. This includes double counts, but I still think it gives a rough approximation of the overall productivity of each organization:
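
As a rough sketch of how this aggregation works (the affiliation mapping below is illustrative, not the full one I used):

```python
from collections import Counter


# Hand-made mapping from (normalized) author name to their main organization;
# only a few illustrative entries are shown here.
AFFILIATION = {
    "seth baum": "Global Catastrophic Risk Institute (GCRI)",
    "david denkenberger": "Alliance to Feed the Earth in Disasters (ALLFED)",
    "anders sandberg": "Future of Humanity Institute (FHI)",
}


def papers_per_organization(author_counts: Counter) -> Counter:
    """Sum the per-author paper counts for each organization.

    A paper co-authored by two researchers at the same organization is counted
    once for each of them, so the organization totals include double counts."""
    org_counts = Counter()
    for author, n in author_counts.items():
        org = AFFILIATION.get(author)
        if org is not None:
            org_counts[org] += n
    return org_counts
```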

Finally, some personal reflections from categorizing research on TERRA. The surge in COVID-19 related papers appears to be dwindling, decreasing in 2022 compared to 2021. On the other hand, interest in AI seems to be growing: in 2022 there was a noticeable uptick in AI-related papers compared to previous years. My impression is that AI-related papers might even constitute around 50% of the papers I categorized as having existential risk potential in 2022, although this is a rough estimate, as I lack access to the categorized data.

One particular concern is the lack of representation of women in the list (1 out of 25). It is unclear what precisely underlies this and how best to address it, but it is undoubtedly a problem, and we need to improve the situation. The community should take steps to make it more welcoming and supportive for women to participate, persist, and excel. For instance, initiatives like mentoring programs specifically aimed at women working on existential risks could be a good option here.

I plan to revisit this project in a year's time and provide an update if the new data shows a significant shift from what is shown here.