Thanks for writing this. I think this may be a cautionary tale of EA hubris, which we should beware of in the future. GWWC had two interns, who had no prior knowledge of the area, write reports on climate change, presumably over the course of about two months. The movement then took on the Cool Earth recommendation as though it was definitely the best thing within climate change. (This is not a criticism of the authors of the reports, who I think did a good job given the constraints.)
There are a lot of foundations distributing billions of dollars in this area, with grant-makers working full time to find opportunities. Climate change is a very complicated area and it is very hard to know who is doing what. I think it would be surprising if GWWC found the best possible thing given its constraints.
I also think it is an argument for checking lots of conclusions and premises with relevant experts.
We might want to consider whether we are doing something similar in other areas: e.g. is our approach in development, the far future, or animal welfare naive according to some experts that we haven’t fully engaged with yet?
I don’t think I would call this hubris. We all knew that the Cool Earth recommendation was low-confidence. But what else were we going to do? To paraphrase Scott Alexander from another recent community controversy, our probability distribution was wide but centered around Cool Earth.
I do think that that nuance occasionally got lost when doing outreach to people not already very informed about EA, but that’s a different problem. We haven’t solved it, but I feel like that’s because it’s hard, not because nobody’s thought about it.
(One could also argue that outreach to mainstream audiences about EA shouldn’t discuss climate change at all, given its place in the movement, but the temptation to make those mainstream audiences more receptive by talking about something they already care about is strong.)
“We all knew that the Cool Earth recommendation was low-confidence.”
I just glanced at the part of Doing Good Better that discusses Cool Earth. It doesn’t seem that low-confidence to me, and does seem a bit self-congratulatory/hubristic.
“We haven’t solved it, but I feel like that’s because it’s hard, not because nobody’s thought about it.”
I think I’ve observed information cascades in Effective Altruism relating to both global poverty and AI risk. The thinking seems to go something like: EA is a great community full of smart people. If a lot of smart people agree on something, it’s probably right. It’s hard to beat a big group of smart people, so it’s probably not worth my time personally to consider this issue in detail, so I will use the opinion of a big group of smart people instead. Then when I offer arguments against some position, the person is like: “I know you’re wrong because a big group of smart people disagrees with you”, without considering my arguments.
Big groups of smart people are great, but only because they have intelligent disagreements in order to come to correct opinions. You should trust an idea because it has survived a lot of diverse challenges, not because a lot of people profess it (and especially not because a lot of people in your social circle profess it, since social circles are self-selected). IMO, if you aren’t personally knowledgeable regarding a diverse set of challenges an idea has survived, you shouldn’t confidently say that idea is correct; otherwise you are part of the problem. Instead you can say: “some people I trust believe X”.
If I’m right, fixing this problem just requires talking about it more. I’m not saying it is a huge problem, but I think I’d prefer that people discussed it a bit more on the margin.
I don’t think nobody delved into the Cool Earth numbers because they assumed a bunch of smart people had already done it. I think nobody delved into the Cool Earth numbers because it wasn’t worth their time, because climate change charities generally aren’t competitive with the standard EA donation opportunities, so the question is only relevant if you’ve decided for non-EA reasons that you’re going to focus on climate change. (Indeed, if I understand correctly the Founders Pledge report was written primarily for non-EA donors who’d decided this.)
Whatever’s been going on with global poverty and AI risk, I think it’s probably a different problem.
(And yes, Doing Good Better was part of what I was referring to with respect to nuance getting lost in popularizations. It’s that problem specifically that I claim is difficult, not the more general problem of groupthink within EA.)
“I think nobody delved into the Cool Earth numbers because it wasn’t worth their time, because climate change charities generally aren’t competitive with the standard EA donation opportunities”
This claim seems to be exactly what people felt was too hubristic: how could anyone be so confident, on the basis of a quick survey of such a complex area, that climate didn’t match up to other donation opportunities?
I actually happen to think that the report was too dismissive of more leveraged climate change interventions, which I expected could be a lot better than the estimates for Cool Earth (especially efficient angles on scientific research and political activity in the climate space). But the OP is suggesting that the original Cool Earth numbers (which indicate much lower cost-effectiveness than charities recommended by EAs in other areas with more robust data) were overstated, not understated (as the original report would suggest, due to regression to the mean and measurement error).
I actually think the best climate charities are better than a lot of other things EAs donate to. From a long-termist point of view, it looks better than global health, zoning in San Francisco, macroeconomic policy, criminal justice reform, and animal welfare. I also think funding ordinary climate policy orgs is a better bet than funding solar geoengineering research, which Open Phil has done. Climate change of >6 degrees is very much on the cards, and this would do tremendous damage in the long term.
Donations in other long-termist areas, like EA field building, AI, bio and nuclear security, are probably better, but I think more could be going into the really outstanding climate organisations out there. I’m aware of at least four really good climate policy orgs whose funding gaps I would like to see filled.
“I think nobody delved into the Cool Earth numbers because it wasn’t worth their time, because climate change charities generally aren’t competitive with the standard EA donation opportunities”
I’ve heard this a lot in EA circles, but I’m not sure why. I tried to compare the Coalition for Rainforest Nations and the Against Malaria Foundation in terms of lives saved and could barely start. It’s just too uncertain and complex for me: I don’t know how to translate CO2 reductions into lives saved, although I’m certain that climate change will kill people. Has anyone even tried to do this kind of comparison? A link would be appreciated.
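For concreteness, here is the shape of the calculation I was trying to run, as a minimal sketch in Python. Every number in it is an invented placeholder rather than a real estimate; pinning down something like deaths_per_tonne is exactly the step I couldn’t get past:

```python
# Back-of-the-envelope comparison: climate charity vs. a lives-saved benchmark.
# All numbers are invented placeholders, NOT real estimates.

cost_per_tonne = 5.0          # assumed $ to avert one tonne of CO2e
deaths_per_tonne = 1e-4       # assumed premature deaths per tonne of CO2e emitted
benchmark_cost_per_life = 5_000.0  # assumed $/life saved for a benchmark like AMF

cost_per_life_climate = cost_per_tonne / deaths_per_tonne
print(f"climate:   ~${cost_per_life_climate:,.0f} per death averted")
print(f"benchmark: ~${benchmark_cost_per_life:,.0f} per death averted")
print(f"ratio:     {cost_per_life_climate / benchmark_cost_per_life:.0f}x")
```

The arithmetic is trivial; the problem is that the whole comparison hinges on that one deeply uncertain conversion factor.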
John Broome has also tried to create a conversion factor from DALYs to CO2. I don’t think any particular estimate is credible. Estimates of the social cost of carbon are for the most part completely made up, unmoored from information on impacts. It’s also very hard because I think most of the costs of climate change are probably very indirect and highly uncertain, stemming from the political risks of unprecedented mass migration.
There are two ways for climate change reduction to be considered effective within EA frameworks: the long-term future, and saving lives/improving utility in the presentish generation. There is some discussion here about the long-term future. For saving lives, I agree it is tricky. When I attempted this in 2005, I tried to do it based on increased utility. Even though it is true that climate change will likely fall disproportionately on less-developed countries, when you look at the actual economic impacts, they accrue mostly to richer people, because richer people make up the majority of the economy. This is especially true in the longer term, when even currently less-developed countries will likely be significantly richer than today. For typical-cost climate interventions, I found they were about 2.5 orders of magnitude less cost-effective than direct global poverty interventions. Another attempt is here (though you may not agree with his discounting). If Cool Earth really is significantly lower cost, of course that would improve the comparison. But I still think it is very unlikely to be better than direct global poverty interventions.
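To show the shape of that utility adjustment (these are not my actual 2005 numbers), here is a minimal sketch under log utility, where the marginal utility of a dollar falls with income; the incomes and the damage figure are invented placeholders:

```python
import math

# The same dollar damage produces very different utility losses under log
# utility. All figures are invented placeholders, not estimates.
damage = 100.0  # assumed $ of climate damage borne by one person

for label, income in [("poorer person", 1_000), ("richer person", 30_000)]:
    utility_loss = math.log(income) - math.log(income - damage)
    print(f"{label} (income ${income:,}): utility loss ~ {utility_loss:.4f}")
```

On these made-up numbers, the same $100 of damage costs the poorer person roughly 30 times as much utility, which is why damages that accrue mostly to richer people weaken the utility case for climate interventions relative to direct poverty work.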
One thing to emphasize more than that writeup did is that, in EA terms, donating to such a lightly researched intervention (a few months’ work) is very likely dominated by donations toward better researching the area, finding higher-expected-value options, and influencing others.
On the other hand, the point estimates in that report favored other charities like AMF over Cool Earth anyway, a conclusion strengthened by the OP critique (not that this excludes finding something else orders of magnitude better, like unusual energy research, very effective political lobbying, or geoengineering; Open Philanthropy has made a few climate grants that look relatively leveraged).
And I agree with John Maxwell about it being oversold in some cases.