I’m sure each individual critic of EA has their own reasons. That said, I suspect two main things were at play pre-FTX (this is intuition, not data, so take it as my best guess).
Firstly, longtermism is easy to criticise. It’s much more abstract, focuses less on doing good in the moment, and can step on causes like malaria prevention that people can more easily get behind emotionally. Longtermism also carries a general implication that if you accept its principles, other causes are essentially irrelevant.
Secondly, everything I just said about longtermism vs. neartermism applies to EA vs. regular charity—just replace “doing good in the moment” with “doing good close to home”. When I first signed up for an EA virtual program, my immediate takeaway was that most of the things I had previously cared about didn’t matter. Nobody said this out loud, everyone was scrupulously polite about it, they were 100% correct, and it was a message that needed to be shared to get people like me on board. This is a feature, not a bug, of EA messaging. But it is not a message people enjoy hearing. The things people care about are generally optimised for making people care about them—see everything trending on Twitter. As a result, people don’t react well to being told, explicitly or implicitly, that they should stop caring about (my personal example here) the amount of money Australian welfare recipients get, and care about malaria prevention halfway across the world instead.
One difference between EA and longtermism is that people rarely criticise neartermism to the same degree, because then you can just point to the hundreds of thousands of lives neartermism has already saved, and the critic looks like an asshole. Longtermism has no such defence, and a lot of people equate longtermism with the EA movement—sometimes out of intellectual dishonesty, and sometimes because longtermism genuinely is a large and growing part of EA.