Will and Rob devote a decent chunk of time to climate change on this 80K podcast, which you might find interesting. One quote from Will stuck with me in particular:
I don’t want there to be this big battle between environmentalism and EA or other views, especially when it’s like it could go either way. It’s like elements of environmentalism which are like extremely in line with what a typical EA would think and then maybe there’s other elements that are less similar [...] For some reason it’s been the case that people are like, “Oh, well it’s not as important as AI”. It’s like an odd framing rather than, “Yes, you’ve had this amazing insight that future generations matter. We are taking these actions that are impacting negatively on future generations. This is something that could make the long run future worse for a whole bunch of different reasons. Is it the very most important thing on the margin to be funding?”
I agree that, as a community, we should make sure we’re up-to-date on climate change to avoid making mistakes or embarrassing ourselves. I also think, at least in the past, the attitude towards climate work has been vaguely dismissive. That’s not helpful, though it seems to be changing (cf. the quote above). As others have mentioned, I suspect climate change is a gateway to EA for a lot of altruistic and long-term-friendly people (it was for me!).
As far as direct longtermist work goes, I’m not convinced that climate change is neglected by EAs. As you mention, climate change has been covered by orgs like 80K and Founders Pledge (disclaimer: I work there). The climate chapter in The Precipice is very good. And while you may be right that it’s a bit naive to just count all climate-related funding in the world when considering the neglectedness of this issue, I suspect that even if you just considered “useful” climate funding, e.g. advocacy for carbon taxes or funding for clean energy, the total would still dwarf the funding for some of the other major risks.
From a non-x-risk perspective, I agree that more work could be done to compare climate work to work in global health and development. There’s a chance that, especially when considering the air pollution benefits of moving away from coal power, climate work could be competitive here. Hauke’s analysis, which you cite, has huge confidence intervals, which at least suggest that the ranking is not obvious.
On the one hand, the great strength of EA is a willingness to prioritize among competing priorities and double down on those where we can have the biggest impact. On the other hand, we want to keep growing and welcoming more allies into the fold. It’s a tricky balancing act and the only way we’ll manage it is through self-reflection. So thanks for bringing that to the table in this post!
I also think, at least in the past, the attitude towards climate work has been vaguely dismissive.
As somewhat of an outsider, this has always been my impression. For example, I expect that if I choose to work in climate, some EAs will infer that I have inferior critical thinking ability.
There’s something about the “gateway to EA” argument that is a bit off-putting. It sounds like “those folks don’t yet understand that only x-risks are important, but eventually we can show them the error of their ways.” I understand that this viewpoint makes sense if you are convinced that your own views are correct, but it strikes me as a bit patronizing. I’m not trying to pick on you in particular, but I see this viewpoint advanced fairly frequently so I wanted to comment on it.
Thanks for sharing that. It’s good to know that that’s how the message comes across. I agree we should avoid that kind of bait-and-switch which engages people under false pretences. Sam discusses this in a different context as the top comment on this post, so it’s an ongoing concern.
I’ll just speak from my own experience. I was focused on climate change throughout my undergrad and early career because I wanted to work on a really important problem and it seemed obvious that this meant I should work on climate change. Learning about EA was eye-opening because I realized (1) there are other important problems on the same scale as climate change, (2) there are frameworks to help me think about how to prioritize work among them, and (3) it might be even more useful for me to work on some of these other problems.
I personally don’t see climate change as some separate thing that people engage with before they switch to “EA stuff.” Climate change is EA stuff. It’s a major global problem that concerns future generations and threatens civilization. However, it is unique among plausible x-risks in that it’s also a widely-known problem that gets lots of attention from funders, voters, politicians, activists, and smart people who want to do altruistic work. Climate change might be the only thing that’s both an x-risk and a Popular Social Cause.
It would be nice for our climate change message to do at least two things. First, help people like me, who are searching for the best thing to do with their life and have landed on climate because it’s a Popular Social Cause, discover the range of other important things to work on. Second, help people like you, who, I assume, care about future generations and want to help solve climate change, work in the most effective way possible. I think we can do both in the future, even if we haven’t in the past.
Yeah, I think many groups struggle with the exact boundary between “marketing” and “deception”. Though EAs are in general very truthful, different EAs will still differ both in where they put that boundary and their actual evaluation of climate change, so their final evaluation of the morality of devoting more attention to climate change for marketing purposes will differ quite a lot.
I was arguing elsewhere in this post for more of a strict “say what you believe” policy, but out of curiosity, would you still have that reaction (to the gateway/PR argument) if the EA in question thought that climate change was, like, pretty good, not the top cause, but decent? To me that seems a lot more ethical and a lot less patronising.
Thanks for the question as it caused me to reflect. I think it is bad to intentionally misrepresent your views in order to appeal to a broader audience, with the express intention of changing their views once you have them listening to you and/or involved in your group. I don’t think this tactic necessarily becomes less bad based on the degree of misrepresentation involved. I would call this deceptive recruiting. It’s manipulative and violates trust. To be clear, I am not accusing anyone of actually doing this, but the idea seems to come up often when “outsiders” (for lack of a better term) are discussed.
Thanks for your comments and for linking to that podcast.
And while you may be right that it’s a bit naive to just count all climate-related funding in the world when considering the neglectedness of this issue, I suspect that even if you just considered “useful” climate funding, e.g. advocacy for carbon taxes or funding for clean energy, the total would still dwarf the funding for some of the other major risks.
In my post I am arguing for an output metric rather than an input metric. In my opinion, climate change will stop being a neglected topic when we actually manage to start flattening the emissions curve. Until that actually happens, humanity is on course for a much darker future. Do you disagree? Are you arguing that it is better to focus on an input metric (level of funding) and use that to determine whether an area has “enough” attention?
It seems to me that this conception of neglectedness doesn’t help much with cause prioritization. Every problem EAs think about is probably neglected in some global sense. As a civilization we should absolutely do more to fight climate change. I think working on effective climate change solutions is a great career choice; better than, like, 98% of other possible options. But a lot of other factors bear on what the absolute best use of marginal resources is.
In my post I am arguing for an output metric rather than an input metric.
But this doesn’t make any sense. It suggests that if a problem is (a) severe and (b) insuperable, we should pour all our effort into it forever, achieving nothing in the process.
The impact equation in Owen Cotton-Barratt’s Prospecting for Gold might be helpful here. Note that his term for neglectedness (what he calls uncrowdedness) depends only on the amount of (useful) work that has already been done, not the value of a solution or the elasticity of progress with work (i.e. tractability). (We can generalise from “work done” to “resources spent”, where effort is one resource you can spend on a problem.)
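To make that concrete, here is a rough sketch of the factorisation Owen uses, in my own notation rather than his exact wording:

$$
\frac{\text{good done}}{\text{extra resources}} \;=\; \frac{\text{good done}}{\text{\% of problem solved}} \times \frac{\text{\% of problem solved}}{\text{\% increase in resources}} \times \frac{\text{\% increase in resources}}{\text{extra resources}}
$$

The three factors correspond to scale, tractability, and uncrowdedness respectively, and the last one comes out to roughly $1/R$, where $R$ is the total resources already devoted to the problem. Crowdedness enters only through that last factor, which depends just on $R$ and not on how valuable a solution is or how tractable the problem remains.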
Now, you can get into the weeds here with exactly what kinds of work count for the purposes of determining crowdedness (presumably you need to downweight in inverse proportion to how well-directed the work is), but I think even under the strictest reasonable definitions the amount of work that has gone into attacking climate change is “a very great deal”.
I can think of some other arguments you might make, around the shape and scale of the first two terms in Owen’s equation, to argue that marginal work put into climate change is still valuable, but none of them depend on redefining neglectedness.
Thanks for clarifying! I understand the intuition behind calling this “neglectedness”, but it pushes in the opposite direction of how EAs usually use the term. I might suggest choosing a different term for this, as it confused me (and, I think, others).
To clarify what I mean by “the opposite direction”: the original motivation behind caring about “neglectedness” was that it’s a heuristic for whether low-hanging fruit exists in a field. If no one has looked into something, then it’s more likely that there is low-hanging fruit, so we should probably prefer domains that are less established (all other things being equal).
The fact that many people have looked into climate change but we still have not “flattened the emissions curve” indicates that there is no low-hanging fruit remaining. So an argument that climate change is “neglected” in the sense you are using the term is actually an argument that it is not neglected in the usual sense of the term. Hence the confusion from me and others.