the article nevertheless portrays the EA community as if it is pushing frivolous, ill-considered ideas instead of supporting the Real, Serious concerns held by Thoughtful and Reasonable people.
What bothers me is that many criticisms of EA that hinge on “EA is neglecting this angle in a careless and malicious manner” could have been addressed with basic Googling.
I don’t expect the average Joe to actively research EA, but someone who’s creating a longform written or video essay with multiple sources should be held to higher standards.
One example is "Effective Altruism and the Cult of Rationality: Shaping the Political Future from FTX to AI" from the Columbia Political Review (cpreview.org), which concludes:

"Thus, we must be wary of the power behind a mindset focused solely on the hypothetical future and allow space and empathy for the short term needs of society. A tyranny of the quantifiably rational majority would lead to more quantifying of human suffering than policy change."
This conclusion somehow manages to completely ignore the neartermist cause areas, the frequent discussions about prioritising neartermism vs. longtermism, and the research on neartermism that does focus on qualitative wellbeing. I genuinely don't know how someone can read dozens of pages about EA and not come across a single reference to neartermism.
Ultimately, I've read so many EA criticism pieces where it feels like the writer has never talked to an EA, or has conveniently ignored that most EAs spend most of their working time thinking about how to solve real problems that affect people.
These articles paint a picture of EA that is completely divorced from my actual interactions with EAs. The actual object-level work done by EA orgs is often described in one or two sentences, while an entire article is devoted to organisational drama and conflict. Like … how do you talk about EA without mentioning the work EAs do to … solve problems???