Could you elaborate a bit on why? This doesn’t sound insane to me, but it is a pretty big disagreement with 80,000 Hours, and I am more sympathetic to 80K’s position on this.
My claim is that EA currently (1) downplays the impact of climate change (e.g. focusing on x-risk, downplaying mainstream impacts) and (2) downplays the value of working on climate change (e.g. low neglectedness, low tractability). If you agree that (1) and (2) are true, then EA is misleading its members about climate change and biasing them to work on other issues.
Perhaps I have misunderstood your argument, but I think you’re saying that (1) and (2) don’t matter because lots of people already care about climate change, so EA doesn’t need to influence more people to work on climate change. I would argue that regardless of how many people already care about climate change, EA should seek to communicate accurately about the impact and importance of work on different cause areas.
My claim is that the fact that so many (smart, capable) people care about climate change work directly causes it to have lower expected value (on the margin). The “impact and importance of work on different cause areas” intimately depends on how many (smart, capable) people are already working or planning to work in those areas, so trying to communicate that impact and importance without taking into account “how many people already care” is fundamentally misguided.
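One way to make “lower expected value on the margin” precise is a toy diminishing-returns model. The logarithmic form here is my own assumption, chosen purely for illustration:

```latex
% Toy model (assumed functional form, purely illustrative): total good
% U from a cause of scale S, after R units of resources have gone in,
% under diminishing (logarithmic) returns.
U(R) = S \ln(1 + R)
% Marginal value of the next unit of resources:
\frac{dU}{dR} = \frac{S}{1 + R}
% This is decreasing in R: however large the scale S, the more people
% already working on the problem, the lower the expected value of the
% marginal contributor.
```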
The claim that climate change is a major PR issue for EA, if true, is evidence that EA’s position on climate change is (in at least this one respect) correct.
I’d like to extend my previous model to have three steps:
(1) EA downplays the impact of climate change (e.g. focusing on x-risk, downplaying mainstream impacts).
(2) EA downplays the value of working on climate change (e.g. low neglectedness, low tractability).
(3) EA discourages people from working on climate change in favor of other causes.
I think you are arguing that since lots of people already care about climate change, (3) is a sensible outcome for EA. To put this more explicitly, I think you are supportive of (2), likely because you perceive climate change as not neglected. As I’ve stated in this post, I think it is possible to argue that climate change is not actually neglected. In fact you suggested below some good arguments for this:
“Even though lots of work has gone into solving climate change, the problem is so vastly complex and multidimensional that there’s still lots of low-hanging fruit left unpicked, so tractability remains high (and scale is large, so the two together are sufficient for impact).”
“Although lots of work has gone into solving climate change, partial solutions aren’t very valuable: most of the impact comes from the last few % of solving the problem. Also [for some reason] we can’t expect future work to continue at the same rate as current work, so more marginal work now is especially valuable.”
“While lots of resources have already gone into solving climate change, the problem is actually getting bigger all the time! So even though neglectedness is low and falling, returns from scale are high and rising, so marginal work on the problem remains valuable.”
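For concreteness, here is a rough sketch of those three shapes as toy return curves. All functional forms and numbers below are my own illustrative assumptions (nothing in this thread pins them down); the only point is the structure of each argument:

```python
import math

# Toy return curves for the three quoted argument shapes. Every constant
# and functional form here is an illustrative assumption, not an estimate.

R_NOW = 1000.0      # resources already invested (arbitrary units)
R_NEEDED = 2000.0   # hypothetical resources required to fully solve the problem

def marginal(u, r, dr=1.0):
    """Finite-difference estimate of the marginal value of one more unit."""
    return u(r + dr) - u(r)

# Shape 1: "lots of low-hanging fruit" -- returns stay close to linear,
# so marginal value falls only slowly even at high R.
def u_low_hanging_fruit(r):
    return 5.0 * r ** 0.9

# Shape 2: "most impact comes from the last few % of solving the problem"
# -- value is convex in the fraction solved, so marginal value rises as
# the problem nears completion.
def u_threshold(r):
    frac_solved = min(r / R_NEEDED, 1.0)
    return 100_000.0 * frac_solved ** 8

# Shape 3: "the problem is getting bigger all the time" -- the scale
# multiplier grows even as work accumulates, offsetting diminishing returns.
def u_growing_scale(r):
    return (1.0 + 0.002 * r) * 50.0 * math.log1p(r)

for name, u in [("low-hanging fruit", u_low_hanging_fruit),
                ("threshold effects", u_threshold),
                ("growing scale", u_growing_scale)]:
    print(f"{name:>17}: marginal value at R={R_NOW:.0f} is {marginal(u, R_NOW):.2f}")
```

Note that none of these curves makes the problem neglected (R is already large in all three); each is a structural reason the marginal value can stay meaningful anyway.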
But let’s put that to one side for a moment, because we haven’t talked about (1) yet. Even if you think (3) is the right conclusion for EA to draw, I still think it’s important that this conclusion is visibly drawn from the best possible information about the expected impacts of climate change. Sections (1), (4), (5), and (7) in my post speak directly to this point.
I look at the way that EA talks about climate change and I think it misses some important points (particularly see section (4) of my post). These gaps in EA’s approach to climate change cause me to have lower trust in EA cause prioritization, and at the more extreme end make me think “EA is the community who don’t seem to care as much about climate change—they don’t seem to think the impact will be so bad”. I think that’s a PR issue for EA.
“In fact you suggested below some good arguments for this”
Two things:
(1) I made those up on the spur of the moment. Possibly you’re just being polite, but I would be very suspicious if all three turned out to be good arguments supporting work on climate change. I said immediately below that I don’t especially believe any of them in the case of climate change.
(2) More importantly, the whole point of coming up with those arguments was that they didn’t depend on claims about neglectedness! None of those are arguments that climate change is neglected; they are potential shapes of arguments for why you might want to prioritise it despite it not being neglected.
I feel like we’re still not connecting regarding the basic definition of neglectedness. You seem to be mixing it up with scale and tractability in a way that isn’t helpful to precise communication.
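For reference, here is the decomposition I have in mind (my paraphrase of 80,000 Hours’ standard factorisation of marginal impact):

```latex
% Marginal good per extra unit of resources R, split into three factors:
\frac{dU}{dR}
  = \underbrace{\frac{dU}{d(\%\text{ solved})}}_{\text{scale}}
  \times \underbrace{\frac{d(\%\text{ solved})}{d(\%\,\Delta R)}}_{\text{tractability}}
  \times \underbrace{\frac{d(\%\,\Delta R)}{dR}}_{\text{neglectedness}\,=\,1/R}
% Neglectedness is just the 1/R term: how many resources already flow
% into the problem. Low-hanging fruit, threshold effects, and growing
% scale all live in the first two factors, not in this one.
```

On this factorisation, the three argument shapes I suggested modify scale and tractability while leaving neglectedness low, which is exactly the point of (2) above.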