I worry that EA is over-focusing on things we can fix relatively easily, like disease and farmed animal welfare, to the detriment of things that could alter the trajectory of our civilisation.
At a few points in this post, you argue for climate change possibly being more important to tackle than the major short-termist causes within EA. Do you also see it as something which should redirect resources currently being spent on long-termist causes? It’s odd to see a claim that EA doesn’t focus enough on “things that could alter the trajectory of our civilization” when most popular critiques of EA say the opposite (too much focus on long-term risks, not enough on helping people in clear/direct ways).
On the other hand, one benefit of comparing climate change to disease/animal welfare is that the donor demographics for climate change may be closer to those for disease/animal causes than to those for other X-risks (e.g. people with an environmentalist bent, or an interest in the Global South).
This intellectual heritage, with its focus on short-term numbers and results influenced by the short-term world of hedge funds, could come at the cost of missing out on broader changes.
This is a small nitpick, but I don’t think I’ve ever seen the claim substantiated that EA’s focus has been unduly influenced by “the short-term world of hedge funds”, even though people make it all the time. Yes, GiveWell was founded by hedge-fund veterans, but the tools they borrowed from Bridgewater were (as far as I know) related to EV calculations, not “having a short time horizon”. EA has, almost since the beginning, had a stronger focus on the long-term future than nearly any other social movement.
Do you also see it as something which should redirect resources currently being spent on long-termist causes?
No, I think that diverting funding from AI alignment to climate change would be a mistake. But optimising the money currently spent on climate change could be useful.
This is a small nitpick, but I don’t think I’ve ever seen the claim substantiated that EA’s focus has been unduly influenced by “the short-term world of hedge funds”, even though people make it all the time. Yes, GiveWell was founded by hedge-fund veterans, but the tools they borrowed from Bridgewater were (as far as I know) related to EV calculations, not “having a short time horizon”. EA has, almost since the beginning, had a stronger focus on the long-term future than nearly any other social movement.
Yes, it felt a little harsh of me to have written that. I agree—it’s a bit of a strawman argument. I think what I was getting at there is perhaps better expressed in the quote from Christine Peterson.
Thanks for your comments.