Climate Change Is Neglected By EA

A year ago Louis Dixon posed the question “Does climate change deserve more attention within EA?”. On May 30th I will be discussing the related question “Is Climate Change Neglected Within EA?” with the Effective Environmentalism group. This post is my attempt to answer that question.

Climate change is an incredibly complex issue in which rising greenhouse gas concentrations are warming the planet, with a long list of knock-on impacts including heatwaves, more intense rainfall, more intense droughts, sea level rise, increased storm surges, increased wildfires, an expansion of the territory affected by tropical diseases, ocean acidification, and shifts in geopolitical power as we transition off fossil fuels.

The consequences of these impacts are happening already and will intensify over the coming decades. Predicting that future is very challenging, and our desire to have some understanding of the future makes it tempting to accept simplified models that often only look at one aspect of the greater problem. It is very difficult to have confidence about any models that we do build, as the world is heading into fundamentally uncharted territory. We have good reason to believe that the impacts of climate change will be severe, quite possibly even catastrophic, and so it is critical that the world takes action.

In this post I will argue the following:

  • Due to the inherent difficulty of predicting the full impacts of climate change, there is limited evidence about exactly what these impacts will be (section 1). EA has drawn questionable conclusions from the evidence that does exist (section 2).

  • EA deems climate change “not neglected” based on the amount of effort already being made, rather than the results achieved (section 3).

  • EA downplays the huge impacts from currently expected levels of climate change (section 4), focusing instead on whether climate change is an x-risk (section 5). This approach risks alienating many people from EA (section 6).

  • Climate change has many different impacts; this is a poor fit for EA, which tries to quantify problems using simple models, and it leads to the undervaluing of action on climate change (section 7). The EA model of prioritising between causes doesn’t work for climate change, which has a broad and effectively permanent impact (section 8).

The result of all of this is that climate change is visibly neglected and undervalued in EA publications. I demonstrate this in two case studies: a talk from EA Global 2019 (section 9), and some key EA publications and initiatives (section 10).

EA is not unique in neglecting climate change; humanity as a whole continues to fail to act on this urgent problem. “Don’t Even Think About It” is an excellent book about the many reasons why this is the case.

Some members of the EA community are already doing great work on climate change. However, in this post I have argued that climate change is generally neglected by EA. This needs to change. EA should visibly promote climate change as one of the most important causes to work on. This should be based on more than just discussion of unlikely x-risk scenarios; it should also include discussion of the severe impacts of the level of climate change which is predicted to happen in the coming decades. The next 10 years will determine whether a 1.5C world is possible, so now is the time for action on climate change.

“The hard part is not wrestling with how bad things could get – it’s understanding how much responsibility we still have to make things better.”—Alex Steffen

1) There is a lack of evidence for the more severe impacts of climate change, rather than evidence that the impacts will not be severe.

The UK government commissioned a 2015 risk assessment on the topic of climate change. It is worth quoting a section of this report here.

--- (Begin quote from 2015 risk assessment)

The detailed chapters of the same report [AR5] suggest that the impacts corresponding to high degrees of temperature increase are not only relatively unknown, but also relatively unstudied. This is illustrated by the following quotes:

  • Crops: “Relatively few studies have considered impacts on cropping systems for scenarios where global mean temperatures increase by 4ºC or more.”

  • Ecosystems: “There are few field-scale experiments on ecosystems at the highest CO2 concentrations projected by RCP8.5 for late in the century, and none of these include the effects of other potential confounding factors.”

  • Health: “Most attempts to quantify health burdens associated with future climate change consider modest increases in global temperature, typically less than 2ºC.”

  • Poverty: “Although there is high agreement about the heterogeneity of future impacts on poverty, few studies consider more diverse climate change scenarios, or the potential of 4ºC and beyond.”

  • Human security: “Much of the current literature on human security and climate change is informed by contemporary relationships and observation and hence is limited in analyzing the human security implications of rapid or severe climate change.”

  • Economics: “Losses accelerate with greater warming, but few quantitative estimates have been completed for additional warming around 3ºC or above.”

A simple conclusion is that we need to know more about the impacts associated with higher degrees of temperature increase. But in many cases this is difficult. For example, it may be close to impossible to say anything about the changes that could take place in complex dynamic systems, such as ecosystems or atmospheric circulation patterns, as a result of very large changes very far into the future.

--- (End quote from 2015 risk assessment)

2) EA has drawn questionable conclusions from this limited evidence base

I previously reviewed two attempts to compute the cost effectiveness of action on climate change and found that both were based on sources which are simultaneously the best available evidence and deeply flawed as a basis for making any kind of precise cost effectiveness estimate. I concluded:

One of the central ideas in effective altruism is that some interventions are orders of magnitude more effective than others. There remain huge uncertainties and unknowns which make any attempt to compute the cost effectiveness of climate change extremely challenging. However, the estimates which have been completed so far don’t make a compelling case that mitigating climate change is actually order(s) of magnitude less effective compared to global health interventions, with many of the remaining uncertainties making it very plausible that climate change interventions are indeed much more effective.

This contradicts one of the underlying cost effectiveness analyses, which concluded that “Global development interventions are generally more effective than Climate change interventions”.

3) Climate Change is deemed “not neglected” based on the amount of effort already being made, rather than the results achieved

The EA importance, tractability, neglectedness (ITN) framework discounts climate change because it is not deemed to be neglected (e.g. a neglectedness score of 2/12 on 80K Hours). I have previously disagreed with this position because it ignores whether the current level of action on climate change is anywhere close to what is actually required to solve the problem (it’s not).

The IPCC predicts that emissions must reach net zero by 2050 to limit warming to 1.5C. However, global emissions continue to increase, apart from dips during short-term economic shocks. We are a long way from achieving sustained global reductions in emissions.

Climate change is a uniquely difficult problem to tackle because it is so massively decentralised. Agriculture, transport, building standards, energy, manufacturing and more all need to be reinvented to work without emitting CO2 and other greenhouse gases. There is no way of avoiding the fact that this will require the work of a very large number of people.

It seems to me that the current application of the ITN framework is akin to arguing during a war against joining the army because there are already lots of soldiers fighting. This argument may make sense at the individual level (will one person really make much difference?) but seems obviously wrong at the population level—the war must be won.

4) EA often ignores or downplays the impact of mainstream climate change, focusing on the tail risk instead

The 80,000 hours “Climate change (extreme risks)” problem profile says:

More extreme scenarios (say, warming of 6 ºC or higher) would likely have very serious negative consequences. Sea levels would rise, crop yields could fall significantly, and there would likely be large water shortages. If we fail to adapt, hundreds of millions of people could die from shortages, conflict, or increased vulnerability to diseases, and billions of people could be displaced.

This profile is focused on these ‘tail’ risks — the chance that the planet will experience extreme warming. Though unlikely, the chance of warming over 6 ºC still seems uncomfortably high. We focus on this possibility, rather than the issue of climate change in general, because preventing the most extreme levels of warming helps prevent the worst possible outcomes.

This frames action on climate change as being focused on preventing the risk of a severe but unlikely outcome. The tail risk is a serious concern, but I think it is a mistake to neglect the avoidable harms that will be caused by the levels of climate change which are very likely without much more rapid global action.

The 80K Hours problem profile makes no mention of the IPCC SR15 report, our best available evidence of the incremental impact of 1.5C vs 2.0C of warming, which predicts that hundreds of millions more people will be severely impacted. As stated previously, we have very little reliable evidence about the quantified impacts of warming beyond 2C, and yet according to Climate Action Tracker we are heading towards a future of 2.3C to 4.1C of warming.

In 2011 the Royal Society considered the impacts of more severe climate change in a special issue titled “Four degrees and beyond: the potential for a global temperature increase of four degrees and its implications”. This included statements that we could see 4C as soon as the 2060s:

If carbon-cycle feedbacks are stronger, which appears less likely but still credible, then 4°C warming could be reached by the early 2060s in projections that are consistent with the IPCC’s ‘likely range’.

And that a 4C world would be incredibly difficult to adapt to:

In such a 4°C world, the limits for human adaptation are likely to be exceeded in many parts of the world, while the limits for adaptation for natural systems would largely be exceeded throughout the world. Hence, the ecosystem services upon which human livelihoods depend would not be preserved. Even though some studies have suggested that adaptation in some areas might still be feasible for human systems, such assessments have generally not taken into account lost ecosystem services.

The 80K Hours problem profile makes no mention of the concept of a carbon budget: the amount of carbon which we can emit before we are committed to a particular level of warming. Since 1751 the world has emitted 1500 Gt CO2, and SR15 estimates a remaining carbon budget of 420 Gt CO2 for a 66% chance of limiting warming to 1.5C. We are emitting ~40 Gt CO2/year, so we will have used this budget up before 2030, and hence committed ourselves to more than 1.5C of warming. Additionally, SR15 predicted that to limit warming to 1.5C, global emissions need to have declined by 45% by 2030. Global emissions are generally continuing to grow, and yet to meet this target they must decline by ~8% a year, every year, globally.
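As a rough check on this arithmetic, the sketch below divides the remaining budget by current annual emissions, and also computes the constant exponential decline rate that would keep cumulative future emissions within the budget. It assumes emissions either stay flat or fall smoothly, which is a simplification rather than a climate model; the figures are the SR15 numbers quoted above.

```python
# Back-of-the-envelope carbon budget arithmetic (simplified assumptions).
remaining_budget_gt = 420.0  # Gt CO2 for a 66% chance of staying under 1.5C (SR15)
annual_emissions_gt = 40.0   # Gt CO2 per year, roughly current global emissions

# If emissions stay flat, years until the 1.5C budget is exhausted:
years_left = remaining_budget_gt / annual_emissions_gt
print(f"Budget exhausted in ~{years_left:.1f} years at current emissions")

# If emissions instead decline exponentially at a constant rate r, cumulative
# future emissions total annual_emissions_gt / r, so staying within the budget
# requires r >= annual_emissions_gt / remaining_budget_gt.
required_decline = annual_emissions_gt / remaining_budget_gt
print(f"Staying within budget requires a sustained decline of ~{required_decline:.0%} per year")
```

This gives roughly ten years of budget at current emissions, and a required sustained decline of roughly 10% per year, in the same ballpark as the ~8% figure above.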

All of the CO2 which is emitted is expected to persist in the atmosphere for centuries unless we can deploy negative emissions technologies at enormous scale. This means that the impacts of climate change are effectively irreversible.

5) EA appears to dismiss climate change because it is not an x-risk

EA has repeatedly asked whether climate change is an x-risk.

  • 2019-12 - EA Global London—Could climate change make Earth uninhabitable for humans? [link]

  • 2019-12 - Linch − 8 things I believe about climate change, including discussion of whether climate change is an x-risk [EA Forum]

  • 2019-06 - Vox—Is climate change an “existential threat” — or just a catastrophic one? [link]

  • 2019-04 - Tessa Alexanian, Zachary Jacobi—Rough Notes for “Second-Order Effects Make Climate Change an Existential Threat” [google doc]

  • 2019-01 - ozymandias—Climate Change Is, In General, Not An Existential Risk [EA Forum]

  • 2018-10 - John Halstead—Is Climate an x-risk? [google doc]

  • 2018-03 - John Halstead—Climate change, geoengineering, and existential risk [EA Forum]

This is an important question for the EA community to consider. The answer so far has been that climate change is not likely to be an x-risk. However, this answer often appears to result in climate change being written off as a cause for EA. Without the x-risk label, it gets stacked up against other, more immediate causes, and better-quantified short-term interventions like global health can offer much more straightforward evidence in their favour.

6) EA is in danger of making itself a niche cause by loudly focusing on topics like x-risk

My first introduction to EA was reading The Life You Can Save by Peter Singer. This made a compelling case that I had the ability, and the moral responsibility, to do concrete good in the world. The first charity that I started giving regularly to was the Against Malaria Foundation.

My impression over the last few years has been that EA has loudly invested a lot of energy into researching and discussing long term causes, with a particular focus on x-risks. Working to better understand and mitigate x-risks is an important area of work for EA, but my own experience is that it feels at times like EA is not a broad movement of people who want to do the most good in whatever cause they are working on, but a narrow movement of people who want to do the most good by focusing on x-risks. This concern was expressed by Peter Singer in an interview in December 2019:

“I certainly respect those who are working on the long-term future, and existential risk and so on, and I think that is important work, it should continue. But, I’m troubled by the idea that that becomes or is close to becoming the public face of the EA movement. Because I do think that there’s only this much narrower group of people who are likely to respond to that kind of appeal.”

This concern has also been expressed before by Dylan Matthews at Vox, with a response from several other members of the EA community here.

Essentially, a reasoned focus on x-risk may end up limiting the growth of EA as a movement to the set of people who can be convinced to be interested in x-risks.

The same argument applies to the way that climate change is discussed within EA. At a time when many individuals and groups are talking about climate change as one of the most important issues of our time, it is striking that EA often downplays its importance. I’m certainly not the only person to have this impression of EA, as was evident in some of the comments on “Does climate change deserve more attention within EA?” from last year:

“The biggest issue I have with EA is the lack of attention to climate change. I am supporter and member of the EA but I take issue with the lack of attention to climate change. Add me to the category of people that are turned off from the community because it’s weak stance of climate change.”

“I would like to offer a simple personal note that my focus and energy has turned away from EA to climate change only. I now spend all of my time and energy on climate related matters. Though I still value EA’s approach to charity giving, it has begun to feel like a voice from the past, from a world that no longer exists. This is how it registers with me now.”

“I agree that I fall closer into this camp. Where the action tends to be towards climate change and that immediate threat, while I play intellectual exercises with EA. The focus on animal welfare and eating a vegan diet help the planet and fighting malaria are related to climate change. Other issues such as AI and nuclear war seem far fetched. It’s hard for me to see the impacts of these threats without preying on my fears. While climate change has an impact on my daily life.”

“quickly after discovering EA (about 4 years ago) I got the impression that climate change and threats to biodiversity were underestimated, and was surprised at how little research and discussion there seems to be.”

“Yes, absolutely...the 80K podcast occasionally pays it lip service by saying “we agree with the scientific consensus”, but it doesn’t seem to go much further than that”

It is of course important to acknowledge that climate change was voted as the no. 2 top priority cause in the EA survey 2019. I find this result both reassuring and surprising. My surprise is due to everything that I am talking about in this post.

7) EA tries to quantify problems using simple models, leading to undervaluing of action on climate change

Every extra ton of CO2 in the atmosphere contributes to many effects, including:

  • Changed weather patterns—more intense rain, more intense droughts, more heatwaves

  • Increased wildfires

  • Increased sea levels, higher storm surges, more intense hurricanes

  • Melting glaciers, melting sea ice, reduced snowpack

  • Ocean acidification

All of these effects have complex dynamics that interact with specific geographical and human population features around the world.

All of these effects have direct and indirect impacts on human lives around the world. Properly weighing up the impact of climate change would require a cost function that we do not know.

They also have impacts beyond those on humans. SR15 predicted that at 2C of warming, 99% of all coral reefs will die. How does EA value that loss? There’s obviously not a single answer, but it is at least true that in both of the previous climate change cost effectiveness analyses, considerations like this were completely absent.

8) The EA model of prioritising between causes doesn’t work for climate change, which has a broad and effectively permanent impact

A choice between buying a malaria net and a deworming tablet is relatively simple to model. However, regardless of which you choose, the product you buy, and how you deliver that product to the people who need it, will have a carbon footprint. In this way, climate change does not fit neatly into an EA worldview which promotes the prioritisation of causes so that you can choose the most important one to work on. Global health interventions have a climate footprint, which I’ve never seen accounted for in EA cost effectiveness calculations.
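As a purely illustrative sketch (not taken from any of the analyses discussed in this post), here is roughly what pricing an intervention’s carbon footprint into a cost effectiveness estimate could look like. The benefit units, dollar cost, emissions and social cost of carbon below are all hypothetical placeholders.

```python
# Hypothetical sketch: pricing an intervention's carbon footprint into a
# cost-effectiveness estimate. Every number here is a placeholder assumption.

SOCIAL_COST_PER_TONNE = 100.0  # $ per tonne of CO2 (illustrative only)

def benefit_per_dollar(benefit_units: float, dollar_cost: float,
                       tonnes_co2: float) -> float:
    """Benefit per dollar once the intervention's own emissions are costed in."""
    total_cost = dollar_cost + tonnes_co2 * SOCIAL_COST_PER_TONNE
    return benefit_units / total_cost

# A made-up intervention: 50 benefit units for $1,000, emitting 2 tonnes of
# CO2 across manufacturing and delivery.
naive = 50 / 1000
adjusted = benefit_per_dollar(50, 1000, 2)
print(f"naive: {naive:.4f} units per $, carbon-adjusted: {adjusted:.4f} units per $")
```

Even in this toy form the adjustment is mechanically simple; the difficult judgement is what value to place on a tonne of CO2.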

Climate change is a problem which is getting worse with time and is expected to persist for centuries. Limiting warming to a certain level gets harder with every year that action is not taken. Many of the causes compared by EA don’t have the same property. For example, if we fail to treat malaria for another ten years, that won’t commit humanity to live with malaria for centuries to come. However, within less than a decade, limiting warming to 1.5C will become impossible.

9) Case study: Could climate change make Earth uninhabitable for humans?

Consider this talk from EA Global 2019. I am using it as an example of a general pattern, not because I think my criticisms are unique to this particular talk.

  • As an EA talk about Climate Change, of course it focuses on whether climate change is an x-risk.

  • At 1:30, the speaker presents three possible mechanisms for climate change being an x-risk: (1) Making Earth Uninhabitable, (2) Increasing Risk of Other X-Risks, (3) Contributing to Societal Collapse. The speaker makes it clear that (2) and (3) are likely to be higher impact, but are less tractable to study, and so the talk will focus on (1). This is a classic EA choice, and it means that the risks discussed substantially understate the overall threat from climate change.

  • At 4:55, the speaker argues that we are planning to create settlements on the Moon and Mars, and that surviving on a warmer Earth will surely be easier. This kind of technology-based reassurance is something I’ve encountered before, such as this comment which suggested that a warmer world could be made habitable by “ice vests” or “a system which burns fuel and then uses absorption chilling to cool the body”. This misses the point that the current world climate makes life very comfortable and easy, and that our ability to survive in a more hostile climate doesn’t make that avoidable outcome much more acceptable.

  • In the Q&A, at 21:00, the questioner asked “… the mainline expectation is not so bad, is that a fair reading of your view?”. In their answer, the speaker describes deaths from climate change as likely to be on the same scale as the number of people who die in traffic accidents. This feels like a really trivialising comparison, and one whose accuracy I am highly sceptical of. The speaker goes on to frame climate change as a low-priority x-risk. This is a perfect example of the way that EA downplays the severity of the mainline impacts of climate change.

  • At 22:40 the questioner asked “do you feel that it is time now to be mobilising and transforming the economy, or do you feel like the jury is still out on that…” (which seems like an incredibly surprising question at this point in the climate movement; based on it, the questioner sounds a bit like a climate sceptic). In their answer, the speaker states that we can emit as much in the future as we have already emitted in the past. This makes it sound like we have plenty of time to address climate change. If we wish to limit warming to 1.5C this is definitely false: as stated above, at current emission levels we will have spent our remaining carbon budget for a 1.5C world before 2030.

10) Case study: Climate is visibly absent or downplayed within some key EA publications and initiatives

On the front page of https://www.effectivealtruism.org/ there are seven articles listed. Climate Change isn’t mentioned, but AI, Biosecurity, and Animal Welfare are.

On the linked Read more… page there is a list of Promising Causes. Climate Change is not on this list.

In April 2018 Will MacAskill gave a TED talk titled “What are the most important moral problems of our time?”. The answer he presented was 1) Global Health, 2) Factory Farming, 3) Existential Risks. He mentions “Extreme Climate Change” in his list of x-risks and goes on to argue that none of these risks is very likely, but that the moral weight of extinction demands that we work on them. As detailed above, this is the classic lens through which EA views climate change: as an unlikely x-risk, rather than as a pressing problem facing the world today.

There are currently four Effective Altruism Funds.

None of these directly fund work on climate change. The Long-Term Future Fund is currently focused on cause prioritisation and AI risk.

80,000 Hours has a summary of the cause areas they have studied. The first section, “Emerging technologies and global catastrophic risks”, makes four recommendations. This is unusually good for EA, as climate change makes it into the top four within this first section. However, as discussed earlier, the problem profile is focused on the “extreme risks”.

Appendix 1) Timeline of climate related EA publications

In an effort to quantify how much attention climate has received within EA, I have compiled a list of relevant articles and publications from the last few years. This list is certainly not comprehensive but was compiled as part of writing this article.

  • 2020-05 − 80K Hours—Climate Problem Profile [link]

  • 2020-04 - Louis Dixon—Climate Shock by Wagner and Weitzman [EA Forum]

  • 2020-02 - John Halstead, Johannes Ackva, FP—Climate & Lifestyle Report [pdf]

  • 2020-01 - Louis Dixon—What should EAs interested in climate change do? [EA Forum]

  • 2020-01 - EA Survey 2019—Climate Change voted the no. 2 top priority cause [EA Forum]

  • 2019-12 - EA Global London—Could climate change make Earth uninhabitable for humans? [link]

  • 2019-12 - Giving Green launched [link]

  • 2019-12 - Linch − 8 things I believe about climate change, including discussion of whether climate change is an x-risk [EA Forum]

  • 2019-12 - Vox—Want to fight climate change effectively? Here’s where to donate your money. [link]

  • 2019-11 - ImpactMatters launched [link]

  • 2019-11 - Taylor Sloan—Applying effective altruism to climate change [blog part 1, blog part 2]

  • 2019-10 - Martin Hare Robertson—Review of Climate Cost-Effectiveness Analyses [EA Forum]

  • 2019-10 - Martin Hare Robertson—Updated Climate Change Problem Profile [EA Forum]

  • 2019-10 - Hauke Hillebrandt—Global development interventions are generally more effective than Climate change interventions [EA Forum]

  • 2019-09 − 2019-11 - Future of Life—Not Cool, 26 episode podcast [link]

  • 2019-08 - Future of Life—The Climate Crisis as an Existential Threat with Simon Beard and Haydn Belfield [podcast]

  • 2019-08 - Will MacAskill—AMA Response: Do you think Climate is neglected by EA? [EA Forum]

  • 2019-07 - Danny Bressler—How Many People Will Climate Change Kill? [EA Forum]

  • 2019-07 - kbog—Extinguishing or preventing coal seam fires is a potential cause area [EA Forum]

  • 2019-06 - Vox—Is climate change an “existential threat” — or just a catastrophic one? [link]

  • 2019-05 - John Halstead + others—EA comment thread: Is biodiversity loss an x-risk? [EA Forum]

  • 2019-04 - Louis Dixon—Does climate change deserve more attention within EA? [EA Forum]

  • 2019-04 - Tessa Alexanian, Zachary Jacobi—Rough Notes for “Second-Order Effects Make Climate Change an Existential Threat” [google doc]

  • 2019-03 - LetsFund—Clean Energy Innovation Policy [link]

  • 2019-03 - EA Global London—Toby Ord fireside chat, including the point that climate tail risk is underappreciated [link]

  • 2019-01 - ozymandias—Climate Change Is, In General, Not An Existential Risk [EA Forum]

  • -> 2019-08 - Pawntoe4—Critique of Halstead’s doc [EA Forum]

  • 2018-10 - John Halstead—Is Climate an x-risk? [google doc]

  • 2018-05 - John Halstead/FP—Climate Change Cause Area Report [pdf]

  • 2018-03 - John Halstead—Climate change, geoengineering, and existential risk [EA Forum]

  • 2017-12 - CSER—Climate Change and the Worst-Case Scenario [link]

  • 2017-06 - Harvard EA—How to Address Climate Change with Effective Giving [pdf]

  • 2016-07 - Future of Life—Op-Ed: Climate Change Is the Most Urgent Existential Risk [link]

  • 2016-04 − 80K Hours—Climate Problem Profile [link]

  • 2016-04 - Giving What We Can—Climate Problem Profile [link]

  • 2016-01 - mdahlhausen—Searching for Effective Environmentalism Candidates [blog post]

  • 2015-11 - Effective Environmentalism FB Group Created

  • 2015-11 - Future of Life—Climate Problem Profile [link]

  • 2015-04 - CSER—Climate Change and The Common Good [link]

  • 2013-05 - Open Philanthropy—Anthropogenic Climate Change [link]