Sadly, I agree with many of the points in this article.
“Just as the astrologer promises us that ‘struggle is in our future’ and can therefore never be refuted, so too can the longtermist simply claim that there are a staggering number of people in the future, thus rendering any counterargument moot,” he wrote in a post on the Effective Altruism forum. This matters, Chugg told me, because “You’re starting to pull numbers out of hats, and comparing them to saving living kids from malaria.”
I’ve been thinking this for a long time but haven’t been able to put it together so succinctly. Personally, I will carry on championing my interpretation of EA, which is to look at charity like your investments and get the best bang for your buck. Whether I’ll use the term ‘EA’ to describe myself will depend on the next few months: if the general understanding of EA becomes speculative longtermism, cultish behavior, and ‘ends justify the means’, then I’d rather not bring it up.
Maybe EA will split in two: one group carrying on as it does now, focused on longtermism, and another focused solely on real impacts that can be seen and measured within our lifetimes. Or maybe it doesn’t matter, as long as you and I keep this in mind when we donate to charities funding malaria nets that save real lives today, however small that impact might look next to SBF and the trillions of future humans at risk of AI going rogue on Mars.
Edit: That’s not to say longtermism doesn’t have its place; I just feel too much time is spent on things that may never happen while real people face real issues today (or will face in the near future, like pandemic preparedness).