Does climate change deserve more attention within EA?

I have been an 80,000 Hours Podcast listener and active in EA for about eighteen months, and in that time I have shifted from a focus on climate change, to animal welfare, and now to x-risk and s-risk, which seem highly promising from an EA perspective. Along the way, I have come to wonder whether some parts of EA might have underplayed climate change, and whether more engagement and content on the topic could be valuable.

While I was thinking about sustainability and ethics, I was frustrated by how limited the coverage of the topic was in the 80,000 Hours Podcast episodes, so I emailed the team. Rob Wiblin responded and suggested that I write up an EA forum post.

Thanks to Alexrjl, John Bachelor, and David Nash for their suggestions and edits.

Edited 29/10/2019 to remove a misquotation of 80,000 Hours, and a few other cases where I wanted to rectify some over-simplifications.

Summary

While it is true that EA and 80,000 Hours are effective in drawing attention to highly neglected areas, my view is that they have unjustly neglected coverage of climate change. There are several reasons why I believe climate change deserves more attention within EA. Firstly, some key opinion-shapers in EA appear to have recently updated towards higher weightings on the severity of climate change. Secondly, though climate change is probably not an existential risk itself, it could be treated as an existential risk factor or multiplier. Thirdly, there are limitations to a crude application of the ITN framework and a short-termist approach to altruism. Fourthly, climate change mitigation and resilience may be more tractable than previously argued. Finally, by failing to show a sufficient appreciation of the severity of climate change, EA may risk losing credibility and alienating potential effective altruists.

Changing perceptions of climate change among key individuals in EA

1. Assessment of climate change in Doing Good Better, 2015

The view taken in this foundational EA book mostly equates the cost of climate change to a year of lost growth, and assigns a ‘small but significant risk’ to warming of more than 4C.

Will MacAskill: Economists tended to assess climate change as not all that bad. Most estimate that climate change will cost only around 2% of global GDP… The thought that climate change would do the equivalent of putting us back one year economically isn’t all that scary: 2013 didn’t seem that much worse than 2014… So the social cost of the average American’s greenhouse gas emissions is about $670 every year. Again, that’s not an insignificant cost, but it’s also not the end of the world.
However, this standard economic analysis fails to faithfully use expected value reasoning. The standard analysis looks only at the effects from the most likely scenario: a 2-4C rise in temperature… there is a small but significant risk of a temperature increase that’s much greater than 2-4C.
The IPCC gives more than 5% probability to temperature rises greater than 6C and even acknowledges a small risk of catastrophic climate change of 10C or more. To be clear, I’m not saying that this is at all likely, in fact it’s very unlikely. But it is possible, and if it were to happen, the consequences would be disastrous, potentially resulting in a civilisational collapse. It’s difficult to give a meaningful answer of how bad that would be, but if we think it’s potentially catastrophic, then we need to revise our evaluation of the importance of mitigating climate change. In that case, the true expected social cost of carbon could be much higher than $32 per metric ton, justifying much more extensive efforts to reduce emissions than the estimates the economists first suggested.

The main text, and the later table of cause prioritisation, use the economic cost model and assume 2-4C of warming, without appreciating the follow-on risks. It seems presumptuous to assume that, without action, warming would stay below 2C, as DGB does. Pledges are just things written on paper; history has taught us that.

This source suggests we are on track for 4.1-4.8C of warming by 2100, so it seems erroneous to treat 2-4C as the baseline assumption. A reader would walk away from this book thinking that climate change is generally not worth worrying about too much, because 2-4C of warming is equivalent to a year of lost growth and the chance of >4C of warming is ‘small but significant’.
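To make the expected value point in the quote above concrete, here is a rough illustration. The 5% probability and the $500-per-tonne catastrophic figure below are purely hypothetical numbers of my own, chosen only to show the shape of the argument; the $32 central estimate is the one quoted above.

$$ \mathbb{E}[\text{SCC}] = (1-p)\cdot \text{SCC}_{\text{central}} + p\cdot \text{SCC}_{\text{catastrophic}} \approx 0.95 \times \$32 + 0.05 \times \$500 \approx \$55 \text{ per tonne} $$

Even a small probability placed on a catastrophic scenario can substantially raise the expected social cost of carbon, which is exactly the revision the quote argues for.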

2. Toby Ord updated his weighting toward climate change in 2018

The text below is from the Fireside Chat with Toby Ord at EAG 2018, in which Toby argues that the tail risks are larger than many people think. It seems that one of the key researchers in EA has updated his views on the severity of climate change.

Will MacAskill: Between climate change and nuclear winter, do you think climate change is too neglected by EA?
Toby Ord: Yeah, actually, I think it probably is.…
I think that there is some existential risk remaining from nuclear war and from climate change… I think that the amount of warming that could happen from climate change is really under-appreciated. The tail risk, the chance that the warming is a lot worse than we expect, is really big. Even if you set aside the serious risks of runaway climate change, of big feedbacks from the methane clathrates or the permafrost, even if you set all of those things aside, scientists say that the estimate for if you doubled CO2 in the atmosphere is three degrees of warming. And that’s what would happen if you doubled it.
But if you look at the fine print, they say it’s actually from 1.5 degrees to 4.5 degrees. That’s a huge range. There’s a factor of three between those estimates, and that’s just a 66% confidence interval… They actually think there’s a one in six chance it’s more than 4.5 degrees… I’m actually a lot more worried about it than I was before I started looking into this.

Even under the baseline scenario Toby describes, a single doubling of atmospheric CO2 (a level of emissions we are not on track to stay within), warming of more than 4.5C is given roughly a 17% probability. Given that we are not on track to limit emissions to that level, and that the knock-on risks of >4C warming are so high, I think Will’s language of ‘a small but significant risk’ of >4C warming does not represent the issue accurately.
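For what it’s worth, the arithmetic behind that 17% figure is simple, if we assume (as a rough simplification) that the probability mass outside the stated 1.5-4.5C range is split evenly between the two tails of the distribution:

$$ P(\Delta T > 4.5^{\circ}\mathrm{C}) \approx \frac{1 - 0.66}{2} = 0.17 $$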

3. The 80,000 Hours Podcast Episode 50 has a brief discussion of the impact of climate change on food production.

David Denkenberger: In the coincident extreme weather or multiple breadbasket failure, you have droughts or floods on multiple continents at once. There was a UK government study on this that estimated right now, it might be around 1% chance per year, but with the slow climate change… They were getting more like 80% chance of this century that something like that would happen.
Robert Wiblin: Wow. Okay.

At the time of writing, the only interview I’m aware of in which climate change mitigation is extensively discussed is the one with Professor Yew-Kwang Ng, in which he argues that interventions to reduce emissions are beneficial from an economic and welfare perspective, because warming trades short-term gains for long-term harms.

4. Climate change affects welfare beyond merely delaying economic growth.

In the Doing Good Better cause review, the impacts are mostly equated to just lower growth, plus a tail-risk of civilisational collapse. However, it seems that this doesn’t capture the full brunt of the impacts under the main emissions trajectories. From the Stern Review, source here:

The Stern Review: By 2100, in South Asia and sub-Saharan Africa, up to 145-220 million additional people could fall below the $2-a-day poverty line, and every year an additional 165,000-250,000 children could die compared with a world without climate change.

Climate change will hit the poorest people in society the hardest, probably increasing inequality, and will damage many of the global supply chains that people rely on, making basic goods harder to access. A recent paper argues that mosquito-borne diseases could reach another billion people as the climate warms.

5. Giving What We Can acknowledges a lack of EA-aligned research in this area

GWWC: Specifically, there haven’t been any studies in the past 16 years which quantify the impact of climate change on global health in DALYs and in a per-tonne figure. Producing quantitative estimates of the exact mortality and morbidity impacts of climate change (and of present emissions) is still a relatively neglected area.

6. Implications of climate change are absent from many 80,000 Hours Podcast episodes discussing future economic growth prospects

In this podcast, Tyler Cowen talks about maximising ‘sustainable economic growth’ without defining what sustainability means, despite the Stern Review highlighting the trade-off between short-term and long-term growth. His implication is that we should focus on growing the economy rather than on reducing GHGs or improving resilience and adaptation.

The Stern Review: Using the results from formal economic models, the Review estimates that if we don’t act, the overall costs and risks of climate change will be equivalent to losing at least 5% of global GDP each year, now and forever. If a wider range of risks and impacts is taken into account, the estimates of damage could rise to 20% of GDP or more. In contrast, the costs of action – reducing greenhouse gas emissions to avoid the worst impacts of climate change – can be limited to around 1% of global GDP each year.

In the most recent 80,000 Hours Podcast interview (at the time of writing), on improving the world through charter cities, there was no mention of climate change, even though adaptation and climate resilience are far behind where they need to be.

How do these cities with huge populations plan to deal with the stresses on global water supplies, floods, and rising temperatures that are already affecting cities around the world today?

For more information on how rising sea levels and temperatures will affect societies and cities, I’d recommend this LRB review, and the book ‘Extreme Cities’.

Climate change could multiply very bad outcomes

7. Climate change is better understood as an existential risk factor.

Will MacAskill: Between climate change and nuclear winter, do you think climate change is too neglected by EA?
Toby Ord: Yeah, actually, I think it probably is...
I think the way to think about this is not that war itself, or great power war, is an existential risk, but rather it’s something else, which I call an existential risk factor.
I’ve been thinking about all these things in terms of whether they could be existential risks, rather than whether they could lead to terrible situations, which could then lead to other bad outcomes.

In this report, John Halstead finds the causal links between climate change and other existential risks difficult to pin down. He argues that more emissions and warming might create destabilisation and nuclear war, but that it is hard to see exactly how.

It seems to me that this ‘existential risk factor’ or ‘existential risk multiplier’ should be far from zero. I would expect that governments dealing with >4C of warming in 100 years would face many domestic pressures, which might make it harder to form agreements on disarmament treaties or nanotechnology protocols, for example.

Another example could be biotechnology: if humanity is adapting to high levels of warming, this could increase the risk that a hostile group (perhaps one displaced by climate change) uses advanced weaponry in a way that poses an existential risk.

This review article notes that senior US security officials have called climate change a ‘threat multiplier’, and includes the following quote from Todd Miller’s book, Storming the Wall: Climate Change, Migration and Homeland Security.

More dangerous than climate disruption was the climate migrant. More dangerous than the drought were the people who can’t farm because of the drought. More dangerous than the hurricane were the people displaced by the storm.

Climate change could limit humanity’s potential

Alternatively, if we take existential risks to be those that limit humanity’s potential, then conflicts over resources in a world with 5-10C of warming could lead to bad institutional lock-in lasting a few hundred years, if not longer.

For example, the capital city of Mongolia, which has already warmed by 2.2C, does not look like a place with great prospects for moral circle expansion (MCE), or like somewhere that can dedicate many resources to x-risk governance.

Climate change could increase s-risk

It also seems possible to me that climate change raises the chance of s-risks, for example by shaping future global dynamics around a failure to cooperate satisfactorily in resolving environmental degradation.

8. Some academics working on climate change are predicting near-term social collapse.

The quote below is from a draft paper by Dr Jem Bendell (Professor of Sustainability Leadership and Founder of the Institute for Leadership and Sustainability (IFLAS) at the University of Cumbria). Social collapse from climate change in the next few decades might have a significantly non-zero probability. The VICE write-up references other sustainability professionals who broadly agree.

That synthesis leads to a conclusion there will be a near-term collapse in society with serious ramifications for the lives of readers. The paper reviews some of the reasons why collapse-denial may exist, in particular, in the professions of sustainability research and practice, therefore leading to these arguments having been absent from these fields until now.

I think that even if we disagree with the severity of these conclusions, we need to assign a significantly non-zero probability to some social collapse scenarios in the next few decades, and then act on that basis (e.g. by improving resilience).

Weaknesses of a crude application of the ITN framework

I think that climate change might also have been neglected because of issues with the ITN framework of cause prioritisation and some intellectual oversights within early EA.

9. The Importance-Tractability-Neglectedness framework is flawed when it assumes the world is static.

As EA has moved from high-impact individual philanthropy to thinking about broader social changes, I think it needs to appreciate that the world is dynamic. In project management, you prioritise tasks partly by how urgently they need to be done. A better framework would include urgency, because if climate change destabilises civilisation, it will be a lot harder to make progress on moral circle expansion, for example. (Thanks to John Bachelor for some thoughts on this, and for the idea of ITNU.)

And even Will MacAskill’s original analysis doesn’t rule out the possibility that climate change causes social collapse. When we bring in the aspect of time, climate change is a more urgent problem, because there are only around ten years in which to halve emissions in order to avoid dangerous levels of climate change. By contrast, the urgency point might change our approach to neglected tropical diseases, as these could still be cured in the future, and advanced technology may make that even easier.
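As a very rough sketch of what an ITNU framework might look like: the first three factors below follow the usual 80,000 Hours-style factorisation, while the urgency term $U(t)$ is my own illustrative addition rather than an established part of the framework.

$$ \text{Priority} \approx \underbrace{\frac{\text{good done}}{\%\text{ of problem solved}}}_{\text{Importance}} \times \underbrace{\frac{\%\text{ of problem solved}}{\%\text{ increase in resources}}}_{\text{Tractability}} \times \underbrace{\frac{\%\text{ increase in resources}}{\text{extra person or dollar}}}_{\text{Neglectedness}} \times \underbrace{U(t)}_{\text{Urgency}} $$

Here $U(t)$ can be read as the fraction of the problem that remains addressable if serious work only begins at time $t$: a problem with a closing window, such as halving emissions within roughly a decade, keeps a high $U$ now but loses it quickly, whereas a problem that can still be solved later, such as many neglected tropical diseases, retains most of its value under delay.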

But on current warming trajectories, it looks like civilisation could be very different in 100 years, and possibly much worse, with an increased risk of lock-in. So the sooner we tackle climate change the better.

10. Creative effective altruists could have a big impact in this challenging field.

I think this was well expressed by Christine Peterson.

Christine Peterson: My initial exposure to effective altruism was to some of the earliest documents and the earliest visions, and there was in some of those, there was a very high emphasis on measurement… However, somehow I got the impression that it was over-emphasized, and that effective altruists were perhaps overly focused on measurement, overly focused on near-term goals, and I … My gut reaction was, no, no. You guys are the most intelligent, most ambitious, most energetic. You’re at a time of your life where you don’t have a lot of burdens on you. You don’t, you’re not raising kids yet. Now is not the time to focus on near-term, easy to measure goals. Now is the time to take on the biggest, hardest, most revolutionary things you possibly can, and throw yourselves at them, because some of you will succeed.

To take a practical example, what is the 200-year effect of AMF when there are water shortages and heatwaves in the future, compared to the 200-year effect of a GHG reduction initiative, a carbon capture program, or a water desalination plant? EA’s intellectual heritage includes a focus on short-term, measurable results, influenced by the short-term world of hedge funds, and this approach risks missing out on broader changes.

For this reason, we should be thinking about infrastructure, and long-term changes to society, such as advocating for MCE across all sentient life. Parfit drew a lot of attention to long-term welfare threats from climate change in his work, such as in this talk at Harvard EA.

11. EA has a bias towards intellectually stimulating abstract problems, and climate change is more emotionally draining and perhaps more boring than other cause areas.

It’s grim to consider how your own neighbourhood may have to start rationing water within the next two decades, if not sooner, or to estimate the probability that climate change causes social collapse where you live within your lifetime.

If you work on risks from nuclear war, the odds of a war happening next year are quite low, so you can probably go about your life unchanged. But in 2019, and the next year, and the year after that, the world will keep getting hotter.

Practical ideas about tackling climate change

12. Climate change reduction could be tractable for many EA readers.

Some recent posts on this forum have shown how competitive applications for top EA orgs can be, and 80,000 Hours now has a readership of three million. While a small share of EA Forum and 80,000 Hours readers might go into biological security and AI research, I think a lot can do things to reduce climate change, e.g. through petitions, community work, research, divestment, and much else, so there are lots of points of attack on the problem.

Listeners to 80,000 Hours and people involved in EA could do many things:

  • Work at major energy companies on reducing emissions

  • Develop low-carbon technology

  • Work on policy advocacy in neglected areas

  • Work on climate modelling

  • Research ways to reduce the risk of social collapse

  • Carry out risk assessments on flood, heat, fire, and biodiversity impacts

  • Work on adaptation and resilience, for example coping with reduced water availability and making agriculture more efficient

13. Improved technology could resolve the collective action problems of climate change.

The 80,000 Hours climate change page argues that the problem could be less tractable because of the free rider problem. However, improved monitoring of emissions from space may help tackle free riding, and breakthroughs in decarbonising industry could improve efficiency. If we take an Elinor Ostrom view, we might be able to build the components needed to resolve these collective action problems with improved technology.

14. There is increasing public interest in climate change.

It is true that climate change is not very neglected, and this is a good argument against working on it relative to areas like biorisk. On the upside, the popularity of the cause means that society already wants to dedicate a huge and increasing amount of resources to tackling this problem.

With global record temperatures in 2018 (which seem likely to be exceeded this year), and the BBC changing its guidelines on reporting climate change (with a documentary presented by David Attenborough coming out on Thursday 18th April), it seems likely to me that there will be huge public interest in, and resources available for, this problem.

Even if the highest value EA can add is in allocating the 0.01% of global resources controlled by people who can be persuaded to think about x-risk, MCE, and so on, capturing some of the resources already flowing towards climate change and directing them with tools like the ITN framework could create a huge amount of value. Alternatively, increasing the share of humanity’s resources dedicated to climate change (rather than to luxury goods) also seems highly valuable.

This could also make the social contagion of climate change reduction initiatives very high. With rising temperatures, your 10% GWWC pledge going to AMF might fail to motivate people in your social circle who are apathetic about climate change, but if it went to CATF, CFRN, or Climate Works, you might be able to inspire many others. Anecdotally, when I Googled “giving what we can”, the most popular autocomplete suggestion was for “climate change”, and the friends I’ve bought Doing Good Better for set up donations to Cool Earth.

15. Even the wealthiest countries are woefully behind on adaptation, and this could perhaps be tractable.

Perhaps as a result of failing to weight the future correctly, and confusion over the extent of climate change, even some of the richest countries in the world are behind where they need to be to maintain current levels of welfare.

Water shortages are expected in the UK within the next 25 years. Building more resilient global infrastructure (e.g. water, energy, food), plus innovation in this space (e.g. water desalination), could be highly tractable.

Lack of engagement with climate change could damage EA

16. Individuals coming to EA might be put off by the lack of engagement with climate change.

In looking for EA material on climate change, I found that the GWWC page hadn’t been updated since 2013, and that 80,000 Hours has very little material on 2-4C of warming, instead focusing its analysis of climate change on tail risks. I came to EA trying to work out whether I should focus on climate change or animal welfare, and I’ve been surprised and frustrated by the lack of detail on climate change, given the huge public interest in it.

Having listened to the 80,000 Hours Podcast and to Parfit’s extinction argument, I agree that it is much more important to work on x-risk (and s-risk), but I wonder whether we are alienating potential EAs by not grappling with this issue.

17. Highlighting that climate change is probably not an extinction risk could reduce fatalism

As before, I agree with John Halstead’s view that it is not an existential risk, and that broadly we can adapt to climate change and thrive in the long reflection, were there no other existential risks. But many of my friends do not agree with this: they would ask what the point of preserving the future is if climate change will make it much less positive than our current lives. A BBC article describes a rise in eco-anxiety and feelings of despondency.

But if we can help craft a narrative about how civilisation can get through its present challenges, then we could have a long and flourishing future. This positivity is one of the many things I love about Parfit and about EA in general. This might also help reduce fatalism and increase the number of people motivated to work on improving the long-term future.

Things I’m uncertain about

  • Whether the scale/neglectedness/tractability framework should be revisited or replaced with a more dynamic model (ITNU, with U for urgency)

  • Whether climate change can actually be tackled meaningfully

  • What the level of public support is for climate change mitigation

  • How much traction there has been with donors for climate change

  • Pros/cons of soft, broad EA on climate change vs narrow EA on x-risk

  • Whether biodiversity loss is a significant long-term problem

  • How well we can adapt to climate change over coming decades

  • How likely is civilisational collapse, or worsening of morals

  • How climate change could impact existential risk

  • Whether climate adaptation could also be potentially high value for EAs

Conclusions

To summarise: I think the importance and scale of climate change are undervalued within EA; the low neglectedness is a good argument against working on it, but it does mean there are a lot of resources available; and the tractability of climate change could be higher than previously thought. When the ITN(U) framework includes urgency in sequencing effective action, it prioritises issues that can shape the ethical trajectory of civilisation.

I agree that climate change is not an x-risk, and that EA shouldn’t focus on it, but we should probably discuss it a bit more and bring our critical thinking to bear on solving the problem more efficiently. Not discussing it seems, at best, like an oversight and, at worst, harmful.

But I’m really unsure about all of this, and would like to learn more.

Thanks for reading; I would appreciate your thoughts and any recommended reading! As I said earlier, I’m thinking about where to allocate my time to do the most good. I do see existential risk reduction and the regulation of new technology as extremely high impact, but I wonder whether there’s more value to be unlocked here in tackling climate change.

Further reading

Climate change

As an x-risk

Taking an EA approach to climate change

  • As before, the Founders Pledge paper on climate change

  • The 80,000 Hours page here (which I discuss above)

  • Elinor Ostrom’s profile, 1990 book Governing the Commons, and Nobel Prize speech

Economics and climate change

  • An exploration of trade-offs between long-term and short-term economic growth

  • The Stern Review summary

Some organisations working on this

I also wrote a short article comparing different ways for individuals to reduce their own carbon footprint.