Executive summary: This post compiles and analyzes numerous expert-generated lists of potential global and existential catastrophes, highlighting both areas of consensus (e.g., nuclear war, AI, climate change, pandemics) and divergence across institutions, while noting that human choices are central both to risk creation and mitigation; it is a curated resource intended to help others understand how different fields frame and prioritize future threats.
Key points:
Tähtinen et al. (2024) produced the most comprehensive catalog of potential societal crises to date—153 in total—classified across six domains (political, economic, social-cultural-health, technological, legal, and environmental), revealing the breadth and complexity of perceived global threats.
UNDRR (2023) focused specifically on hazards with escalation potential, identifying ten threats—including nuclear war, pandemics, and AI-related risks—that could cascade into existential catastrophes due to characteristics like global scope, irreversibility, and systemic impact.
The World Economic Forum (2025) survey highlights short- and long-term global risks as perceived by experts, with near-term concerns centered on misinformation and conflict, and long-term fears shifting toward climate-related events—while also spotlighting inequality as a highly influential underlying risk driver.
Foundational GCR research (Ord, ÓhÉigeartaigh, Avin, Sepasspour) agrees on key existential threats (e.g., AI, nuclear weapons, pandemics, climate change) and emphasizes humanity’s role in both causing and potentially preventing these outcomes; cascading failures and systemic fragility emerge as critical concerns.
A recent horizon scan (Dal Prá et al., 2024) identifies underexplored but emerging threats like AI-nuclear integration, surveillance regimes, and the collapse of food systems, reflecting experts’ growing attention to interconnected and human-amplified risks.
Policy uptake remains uneven: while some risks like nuclear war receive consistent attention (e.g., at the UN), others—particularly newer technological risks—are underrepresented in global governance frameworks and national risk assessments, with countries varying significantly in scope and coverage.
This comment was auto-generated by the EA Forum Team. Feel free to point out issues with this summary by replying to the comment, and contact us if you have feedback.