Thanks for the reply. As I recently commented on a different post, engagement with commenters is a crucial part of a post like this, and I’m glad you’re doing that even though a lot of the response has been negative (and some of it has been mean, which I don’t support). That isn’t easy.
I think it’s worth emphasizing that the title of this post is “Climate Change Is Neglected By EA”, rather than “Climate Change Is Ignored By EA”, or “Climate Change Is the Single Most Important Cause Above All Others”. I am strongly in favor of cause-impartial prioritisation.
This sounds good to me. However, you don’t actually give much indication in the post of how you think climate change stacks up against other cause areas, though you do implicitly do so here:
In “Updated Climate Change Problem Profile” I argued that Climate Change should receive an overall score of 24 rather than 20. That’s a fairly modest increase.
For comparison, on 80K’s website right now, AI risk, global priorities research and meta-EA are currently at 26, biosecurity and ending factory farming are at 23, and nuclear security and global health are at 21. So your implicit claim is that, on the margin, climate change is less important than AI and GPR, slightly more important than biosecurity and farmed animal welfare, and much more important than nuclear security and global health (of the bednets and deworming variety). Does that sound right to you? That isn’t a gotcha, I am genuinely asking, though I do think some elaboration on the comparisons would be valuable.
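For concreteness, here is a minimal sketch of how those overall scores are (roughly) built under 80K’s problem framework: each factor is scored on a log scale and the factor scores are summed. The sub-scores below are made up purely to illustrate the mechanism, not 80K’s published numbers.

```python
def overall_score(scale: int, neglectedness: int, solvability: int) -> int:
    """80K-style overall problem score: the sum of the three factor scores.

    Each factor is already on a (rough) log scale, so adding the scores
    corresponds to multiplying the underlying quantities.
    """
    return scale + neglectedness + solvability

# Hypothetical decomposition of the 20 -> 24 change argued for in the post:
# a higher assessment of scale and solvability, with neglectedness unchanged.
current_profile = overall_score(scale=12, neglectedness=2, solvability=6)   # 20
updated_profile = overall_score(scale=14, neglectedness=2, solvability=8)   # 24

print(current_profile, updated_profile)  # 20 24
```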
I don’t agree with this “direct tension”. I’m arguing that (A) Climate Change really is more important than EA often makes it out to be, and that (B) EA would benefit from engaging with people about climate change from an EA perspective. Perhaps as part of this engagement you can encourage them to also consider other causes. However, starting out from an EA position which downplays climate change is both factually wrong and alienating to potential EA community members.
I think I’m sticking to the “direct tension” claim. If oodles of smart, motivated young people are super-excited about climate change work, a decent chunk of them will end up doing climate change work. We’re proposing that these are the sorts of people who would otherwise make a good fit for EA, so we can assume they’re fairly smart and numerate. I’d guess they’d have less impact working on climate change outside EA than within it, but they won’t totally waste their time. So if there are lots of these people, then lots of valuable climate change work will be done with or without EA’s involvement. Conversely, if there aren’t lots of these people (which seems false), the fact that we’re alienating (some of) them by not prioritising climate change isn’t a big issue.
I think you could argue that climate change work remains competitive with other top causes despite not being neglected (I’m sceptical, but it wouldn’t astound me if this were the case). I think you could argue that the gains in recruitment from small increases in perceived openness to climate change work are worth it, despite climate change not being neglected (this is fairly plausible to me). But I don’t think you can simultaneously argue that climate change is badly neglected, and that we’re alienating loads of people by not focusing on it.
For comparison, on 80K’s website right now, AI risk, global priorities research and meta-EA are currently at 26, biosecurity and ending factory farming are at 23, and nuclear security and global health are at 21. So your implicit claim is that, on the margin, climate change is less important than AI and GPR, slightly more important than biosecurity and farmed animal welfare, and much more important than nuclear security and global health (of the bednets and deworming variety). Does that sound right to you? That isn’t a gotcha, I am genuinely asking, though I do think some elaboration on the comparisons would be valuable.
That does sound about right to me.
If oodles of smart, motivated young people are super-excited about climate change work, a decent chunk of them will end up doing climate change work. We’re proposing that these are the sorts of people who would otherwise make a good fit for EA, so we can assume they’re fairly smart and numerate. I’d guess they’d have less impact working on climate change outside EA than within it, but they won’t totally waste their time. So if there are lots of these people, then lots of valuable climate change work will be done with or without EA’s involvement. Conversely, if there aren’t lots of these people (which seems false), the fact that we’re alienating (some of) them by not prioritising climate change isn’t a big issue.
My claim is that EA currently (1) downplays the impact of climate change (e.g. focusing on x-risk, downplaying mainstream impacts) and (2) downplays the value of working on climate change (e.g. low neglectedness, low tractability). If you agree that (1, 2) are true, then EA is misleading its members about climate change and biasing them to work on other issues.
Perhaps I have misunderstood your argument, but I think you’re saying that (1, 2) don’t matter because lots of people already care about climate change, so EA doesn’t need to influence more people to work on climate change. I would argue that regardless of how many people already care about climate change, EA should seek to communicate accurately about the impact and importance of work on different cause areas.
Could you elaborate a bit on why? This doesn’t sound insane to me, but it is a pretty big disagreement with 80,000 Hours, and I am more sympathetic to 80K’s position on this.
My claim is that EA currently (1) downplays the impact of climate change (e.g. focusing on x-risk, downplaying mainstream impacts) and (2) downplays the value of working on climate change (e.g. low neglectedness, low tractability). If you agree that (1, 2) are true, then EA is misleading its members about climate change and biasing them to work on other issues.
Perhaps I have misunderstood your argument, but I think you’re saying that (1, 2) don’t matter because lots of people already care about climate change, so EA doesn’t need to influence more people to work on climate change. I would argue that regardless of how many people already care about climate change, EA should seek to communicate accurately about the impact and importance of work on different cause areas.
My claim is that the fact that so many (smart, capable) people care about climate change work directly causes it to have lower expected value (on the margin). The “impact and importance of work on different cause areas” intimately depends on how many (smart, capable) people are already working or planning to work in those areas, so trying to communicate that impact and importance without taking into account “how many people already care” is fundamentally misguided.
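To make that concrete, here is a minimal sketch under one purely illustrative assumption: logarithmic returns to the total resources devoted to a problem. Under that assumption, the value of one additional worker scales as 1 / (resources already committed), so two problems with identical scale and tractability can differ enormously in marginal value just because of how crowded they are.

```python
def marginal_value(scale: float, solvability: float, resources: float) -> float:
    """Value of one extra unit of resources, assuming total good done
    grows like scale * solvability * log(resources).

    d/dR [scale * solvability * log(R)] = scale * solvability / R,
    so marginal value falls as the field gets more crowded.
    """
    return scale * solvability / resources

# Two hypothetical problems, identical except for crowdedness:
crowded = marginal_value(scale=100.0, solvability=1.0, resources=1000.0)
quiet = marginal_value(scale=100.0, solvability=1.0, resources=10.0)

print(crowded, quiet)  # 0.1 vs 10.0: the same extra person does ~100x more on the quieter problem
```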
The claim that climate change is a major PR issue for EA, if true, is evidence that EA’s position on climate change is (in at least this one respect) correct.
The claim that climate change is a major PR issue for EA, if true, is evidence that EA’s position on climate change is (in at least this one respect) correct.
I’d like to extend my previous model to have three steps:
(1) EA downplays the impact of climate change (e.g. focusing on x-risk, downplaying mainstream impacts).
(2) EA downplays the value of working on climate change (e.g. low neglectedness, low tractability).
(3) EA discourages people from working on climate change in favor of other causes.
I think you are arguing that since lots of people already care about climate change, (3) is a sensible outcome for EA. To put this more explicitly, I think you are supportive of (2), likely because you perceive climate change as not neglected. As I’ve stated in this post, I think it is possible to argue that climate change is not actually neglected. In fact you suggested below some good arguments for this:
“Even though lots of work has gone into solving climate change, the problem is so vastly complex and multidimensional that there’s still lots of low-hanging fruit left unpicked, so tractability remains high (and scale is large, so the two together are sufficient for impact).”
“Although lots of work has gone into solving climate change, partial solutions aren’t very valuable: most of the impact comes from the last few % of solving the problem. Also [for some reason] we can’t expect future work to continue at the same rate as current work, so more marginal work now is especially valuable.”
“While lots of resources have already gone into solving climate change, the problem is actually getting bigger all the time! So even though neglectedness is low and falling, returns from scale are high and rising, so marginal work on the problem remains valuable.”
But let’s put that to one side for a moment, because we haven’t talked about (1) yet. Even if you think (3) is the right conclusion for EA to draw, I still think it’s important that this conclusion is visibly drawn from the best possible information about the expected impacts of climate change. Sections 1, 4, 5, and 7 of my post speak directly to this point.
I look at the way that EA talks about climate change and I think it misses some important points (see in particular section 4 of my post). These gaps in EA’s approach to climate change lower my trust in EA cause prioritization, and at the more extreme end make me think “EA is the community that doesn’t seem to care as much about climate change; they don’t seem to think the impact will be so bad”. I think that’s a PR issue for EA.
In fact you suggested below some good arguments for this
Two things:
I made those up on the spur of the moment. Possibly you’re just being polite, but I would be very suspicious if all three turned out to be good arguments supporting work on climate change. I said immediately below that I don’t especially believe any of them in the case of climate change.
More importantly, the whole point of coming up with those arguments was that they didn’t depend on claims about neglectedness! None of those are arguments that climate change is neglected, they are potential shapes of arguments for why you might want to prioritise it despite it not being neglected.
I feel like we’re still not connecting regarding the basic definition of neglectedness. You seem to be mixing it up with scale and tractability in a way that isn’t helpful to precise communication.
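To spell out the distinction I mean, here is a minimal sketch of the standard ITN decomposition, with all numbers hypothetical: the three factors are separate terms that multiply together, so a problem can score highly on scale and tractability while still scoring low on neglectedness, and arguments about one factor are not arguments about the others.

```python
def marginal_impact(scale: float, tractability: float, crowdedness: float) -> float:
    """Standard ITN decomposition of the impact of one extra unit of resources.

    scale:        good done per fraction of the problem solved
    tractability: fraction of the problem solved per proportional increase in resources
    crowdedness:  resources already devoted to the problem (the inverse of neglectedness)
    """
    return scale * tractability / crowdedness

# A large-scale, tractable, but crowded problem vs. a smaller, much less crowded one:
big_but_crowded = marginal_impact(scale=1000.0, tractability=0.3, crowdedness=600.0)
small_but_quiet = marginal_impact(scale=100.0, tractability=0.3, crowdedness=6.0)

print(big_but_crowded, small_but_quiet)  # 0.5 vs 5.0
```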