Executive summary: This post offers a detailed summary and critical commentary on Chapter 4 of More Everything Forever, a new book by Adam Becker presenting a strongly critical, leftist-aligned analysis of Effective Altruism (EA), longtermism, and Rationalist ideas; Becker argues that EA's speculative and utilitarian framework is politically naive and ethically misguided. The post's author ultimately finds Becker's critique familiar, ideologically constrained, and unpersuasive.
Key points:
Becker critiques the culture and infrastructure of EA through descriptions of Trajan House and interviews with figures like Anders Sandberg, portraying the community as a mix of academia, tech startup culture, and speculative futurism (e.g., cryonics).
The book's main intellectual targets are longtermism and existential-risk prioritization: Becker challenges Ord's 1-in-6 x-risk estimate and criticizes the deprioritization of climate change relative to more speculative risks such as AGI.
Becker's political critique of EA's influence and funding highlights ties to powerful institutions (e.g., Open Philanthropy, RAND, UK political actors), which he argues represent elite overreach and ideological overconfidence.
Philosophical and methodological objections focus on utilitarianism and Pascalian muggings: Becker argues that longtermist reasoning is hypersensitive to speculative assumptions and lacks empirical robustness, especially compared to climate science.
The post's author pushes back on the critique, arguing that Becker omits EA's contributions to global health and poverty reduction, misrepresents common EA positions, and applies a reductive leftist framework that fails to engage seriously with utilitarian ethics or pluralistic intellectual inquiry.
The author concludes that while critique of EA is valid and should be welcomed, Becker's framing feels ideologically rigid, dismissive of good-faith philosophical exploration, and more focused on scoring points than on engaging EA's best ideas.
This comment was auto-generated by the EA Forum Team. Feel free to point out issues with this summary by replying to the comment, and contact us if you have feedback.