An entire category of risks is undervalued by EA [Summary of previous forum post]

This post is a brief summary of a longer forum post I wrote on systemic cascading risks & their relevance to the long-term future (an EA criticism competition submission).

-------------------------------

I make three strong claims:

(1) The cascading sociopolitical & economic effects of climate change, pandemics, and conflicts are undervalued in the mainstream longtermist community. These systemic cascading risks[1] can be extremely important to the long-term future through shaping the development of powerful technologies in the next 10-30 years.

(2) Institutional resilience is the generalized solution to systemic cascading risks. A resilient food, water, energy, and infrastructure nexus is key to ensuring system stability around the necessities of life during a crisis, helping to tractably hedge against all systemic cascading risks at once.

(3) The systemic cascading lens fills the missing gap between current events & longtermism, solving a key question in EA epistemics.

0. What is a systemic cascading risk?

Envision our societal structures as a graph of nodes and links: a systemic cascading risk shocks a subset of societal “nodes” and causes n-th order effects that cascade across systems & magnify in volatile, harmful ways – exploiting underlying systemic flaws and interdependencies.
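To make the graph picture concrete, here is a minimal toy sketch of shock propagation over a dependency graph. The node names, edge weights, and propagation rule are all illustrative assumptions of mine, not part of the original post:

```python
from collections import deque

# Hypothetical dependency graph: each edge (neighbor, weight) transmits
# a fraction of the upstream node's stress downstream. Names and weights
# are invented for illustration.
dependencies = {
    "energy": [("food", 0.6), ("infrastructure", 0.5)],
    "food": [("political_stability", 0.7)],
    "infrastructure": [("political_stability", 0.4)],
    "political_stability": [],
}

def cascade(graph, shocked, magnitude, threshold=0.05):
    """Breadth-first propagation of a shock: stress accumulates at
    downstream nodes; contributions below the threshold are dropped,
    which (with weights < 1) keeps the cascade finite."""
    stress = {node: 0.0 for node in graph}
    stress[shocked] = magnitude
    queue = deque([shocked])
    while queue:
        node = queue.popleft()
        for neighbor, weight in graph[node]:
            delta = stress[node] * weight
            if delta > threshold:
                stress[neighbor] += delta
                queue.append(neighbor)
    return stress

print(cascade(dependencies, "energy", 1.0))
```

Note how an energy shock reaches political stability through two paths at once (via food and via infrastructure) – the compounding of indirect effects is exactly what makes these risks systemic rather than isolated.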

I subsequently discuss COVID, the Russia-Ukraine war, and climate change as examples of pandemic, conflict, and environmental systemic cascading risks respectively.

Pandemic and Conflict Risk: COVID & Russia-Ukraine War

COVID-19, alongside the Russian invasion of Ukraine, served as a catalyst for food inflation and undermined perceived institutional legitimacy[2]:

High food inflation levels are expected to last until 2024, further testing fragile states reliant on food imports and triggering social distress[3]. Historically, the absence of adequate food, water, and energy drives political instability – e.g. the 1977 & 1984 Egyptian and Moroccan bread riots, 1989 Jordanian protests, and 2011 Arab Spring.

Environmental Risk: Climate Change

In the next 30 years, anthropogenic climate change is projected to cause ~216 million internal climate migrants due to heat stress/​desertification/​land loss[4], ~150 million displaced by sea level rise, ~5 billion people living in moderately water-stressed areas, risks for multi-breadbasket failure and disruption to food supply chains, and related inflationary and poverty effects.

These stresses on our societal systems are likely to contribute to political instability and democratic erosion.[5][6]

1. How are they relevant to the long-term future?

Much of EA focuses on tail-end risks – like whether a conflict could cause nuclear war, or whether climate change poses a direct existential risk[7] – rather than whether an event could cascade across systems and make them more fragile and susceptible to other compounding risks.

However, it makes sense to try to stabilize the political conditions that technologies mature in. The next 10-30 years are a path-dependent precipice – both in terms of political instability & long-term technological development. The potential interaction between these two factors is dangerous.

Value Lock-In Relevance

By encoding certain values into powerful technologies, one encodes the sociotechnical nature of a very particular time and place.[8]

Path dependency suggests that the society we become post-crisis may miss certain values. Political crisis and fear tend to produce anti-democratic, authoritarian, and violent social values; abundance tends to beget altruism and peace.[9] If AGI values development occurs during a volatile, “traumatizing” time of climate- and crisis-driven scarcity & tension, there is a strong possibility of an AGI that locks in values misaligned with humanity in general (e.g. authoritarianism); these may be values we are permanently stuck with.

Existential Risk Relevance

Political factors can significantly impact existential risk calculations. Though the likelihoods of specific possibilities are tenuous, speculative, and unknown, we may generally see military AI capabilities research driven forward by increased conflict and arms race dynamics[10], a harder-to-implement AI governance landscape due to international tensions[11], and/​or a multiplying effect on nuclear weapons and bioweapons x-risk.

I contend that allowing political and economic instability to affect existential technologies is a dangerous game to play; systemic cascading risks threaten our ability to develop new technologies safely, competently, and cooperatively.

2. How do we solve systemic cascading risks?

Institutional resilience. The 21st century is revealing how uniquely interconnected and vulnerable our societal systems are; all systemic cascading risks can be tractably mitigated in tandem by reducing system fragility – e.g. by tracking and securing the commodities necessary to live.

Food, water, energy, and infrastructure (where housing falls under infrastructure) form the nexus of what societies require for survival, giving us a comprehensive framework to target systemic resiliency interventions toward. Political stability rests on securing this nexus.

Tractable interventions may include resilient & emergency food investment, drought monitoring & resilience, climate vulnerability analyses on supply chains, scaling substitutes for vital food and energy sources to build redundancy, reforming land use & regulations, developing fast & cost-effective refugee shelters, and developing a flexible crisis response team[12]. Modeling & scenario analysis around key supply chain interdependencies can also inform intervention efforts (helping to target interventions towards maximally effective areas) and support risk incentives (accurately projecting second-order consequences can incentivize governments and risk-sensitive organizations toward a coordinated systemic reform/​response).
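As a toy illustration of the scenario-analysis idea, a simple Monte Carlo sketch can estimate how often a supply shortfall occurs under random supplier disruptions, with and without a redundant backup source. All numbers (capacities, failure probabilities, demand) are hypothetical, invented purely to show the method:

```python
import random

def shortage_rate(suppliers, demand, backup=0.0, trials=10_000, seed=0):
    """Fraction of simulated periods in which total surviving supply
    (plus any backup capacity) falls short of demand. Each supplier is
    a (capacity, probability-of-disruption) pair."""
    rng = random.Random(seed)
    shortages = 0
    for _ in range(trials):
        supply = sum(cap for cap, p_fail in suppliers
                     if rng.random() > p_fail)
        if supply + backup < demand:
            shortages += 1
    return shortages / trials

# Illustrative suppliers: (capacity, disruption probability per period).
suppliers = [(50, 0.10), (30, 0.20), (20, 0.30)]
baseline = shortage_rate(suppliers, demand=80)
with_backup = shortage_rate(suppliers, demand=80, backup=20)
print(baseline, with_backup)
```

Even this crude model makes the redundancy argument quantitative: adding backup capacity measurably reduces the shortfall rate, and the same machinery can rank which redundancy investments buy the most stability per unit cost.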

Neglectedness: Climate adaptation[13] and drought monitoring interventions[14] are also relatively neglected by climate capital in the status quo, including by EA climate efforts[15]. Although supply chain resiliency is not neglected broadly, current failure modes indicate there are likely tractable and neglected sub-areas where contingent efforts can produce great value.[16]

Brennan (2016) comments in their blog post Missing Cause Areas that:

There is a significant gap in effective altruism for structural change in between “buy bed nets” and “literally build God”. And while development in Africa is a fiendishly difficult topic, so are wild-animal suffering and preventing existential risk, and effective altruists seem to have mostly approached the latter with an attitude of “challenge accepted”.

Because of how EA meta-cause-areas developed, there are currently missing layers of nuance between the literal end of the world and current global health & poverty.

The path dependency, cascading systemic risk, and values lock-in frameworks fill that gap, capturing the nuances and subtleties of how societal values and current events can shift technology development, and contending that institutional and cultural changes on a 5-30 year timespan are of great importance.

At its core, the missing link exists because systemic thinking & complexity science are largely overlooked and unexplored in the EA community. The epistemics the EA community embraces – evidence-based logic and empiricism – lean heavily on linear, quantifiable, direct effects. Any system of thought that relies on cascading n-th order effects may therefore be largely disregarded and not taken seriously in the community. As qualitative experience, development theories, and academic studies show us, sole linear causation is highly unlikely, and overlooking higher-order effects has led to a multitude of failed forecasts and policies. The systemic cascading risk framework helps connect complexity effects with longtermist cause area ranking and provides resolute, tractable solutions to such problems.

  1. ^

    To clarify, I do not intend to discuss systemic cascading GCRs, but rather the broader category of risks of which systemic cascading GCRs are a tail-end example.

  2. ^

    Examples include suspension of rule of law in Tunisia, violent Spanish protests, and South African “Zuma riots” (the worst violence in the country since the end of apartheid). More examples in my longer forum post.

  3. ^

    Examples include an Indonesian palm oil export ban; Malaysian chicken export ban; Indian wheat export ban; Argentinian export cap on corn and wheat. The Sri Lankan debt default, the worst economic crisis since the country’s founding, was followed by violent protests and a Presidential ousting.

  4. ^

    Related sub-factors include water scarcity, lower crop productivity, sea level rise & storm surge, and extreme weather events.

  5. ^

    If you’re interested in more evidence, feel free to check out this section of my longer forum post.

  6. ^

    Freedom House’s 2021 Democracy Under Siege report seems particularly relevant here.

  7. ^

    Only recently has there been a shift in thinking of climate change within the EA community – from an unlikely, unimportant tail-end direct risk to a possible existential risk multiplier.

  8. ^

    Similar dynamics played out with historical technologies encoded with the values of their time – e.g. racist architectural exclusion and car-centric cul-de-sacs and interstate highway systems in the U.S.

    Thus, who and why someone creates technology – and their core values – matter.

    This was inspired by William MacAskill’s What We Owe the Future.

  9. ^

    Due to the international lack of resiliency and cooperation, I’d wager the overall set of social values practically available to society after a climate catastrophe (for example) is on average significantly worse and less likely to provide large utility to a large group of people.

  10. ^

    Powerful countries’ militaries – e.g. U.S. DoD – are already preparing counterinsurgency efforts in response to climate terrorism. Armed drone development and applied AI in military intel & decision-making will likely be favored due to cost and effectiveness.

  11. ^

    Sociopolitical tension, the election of politically extreme governments, and the violation of international norms can pose a significant barrier to international cooperation in AGI regulation. Notably, any long-term solution involving AGI governance would likely involve the U.S. and China.

  12. ^

    See Kulveit and Leech’s forum post on emergency response teams and their proposal, ALERT.

  13. ^

    Climate adaptation makes up only ~5% of all climate finance, including both public and private capital flows.

  14. ^

    54% of WMO members had missing or inadequate drought warning systems (as of 2021).

  15. ^

    EA paradigms for addressing climate change usually fall under GHG emissions reduction and not resilience, including the 80,000 Hours page and Founder’s Pledge’s Climate Change Fund.

    To the extent EA resilience work exists, it tends to focus on global catastrophic risk (e.g. nuclear war) and not systemic cascading risk – e.g. ALLFED and Open Phil’s grant (May 2020) to Penn State for Research on Emergency Food Resilience.

  16. ^

    For example, the World Bank’s Groundswell report finds that adaptation development, when developed alongside other prevention efforts, can reduce the scale of climate migration by up to 80% – potentially greatly increasing global stability.