fyi for anyone like me who doesn’t have lots of the backstory here and doesn’t want to read through Habryka’s extensive corpus of EAF writings, here is Claude 3.7 Sonnet’s summary based on the first page of comments Habryka links to.
Based on Habryka’s posts, I can provide a summary of his key disagreements with EA leadership and forum administrators that ultimately led to his decision to leave the community.
Key Disagreements
Leadership and Accountability: Habryka repeatedly expresses concern about what he sees as a “leaderless” EA community. He believes the community has shifted from being driven by independent intellectual contributors to being determined by “a closed-off set of leaders with little history of intellectual contributions.” He argues that almost everyone who was historically in leadership positions has stepped back and abdicated their roles.
Institutional Integrity: He criticizes EA organizations, particularly CEA (Centre for Effective Altruism), for prioritizing growth, prestige, and public image over intellectual integrity. In his posts, he describes personal experiences at CEA where they “deployed highly adversarial strategies” to maintain control over EA’s public image and meaning.
FTX Situation: Habryka was particularly critical of how EA leadership handled Sam Bankman-Fried (SBF) and FTX. He claims to have warned people about SBF’s reputation for dishonesty, but these warnings were not heeded. He criticizes Will MacAskill and others for their continued endorsement of SBF despite red flags, and was frustrated by the lack of transparency and open discussion after FTX’s collapse.
Risk-Aversion and PR Focus: He repeatedly criticizes what he perceives as excessive risk-aversion and PR-mindedness among EA organizations. He argues this approach prevents honest discussion of important issues and contributes to a culture of conformity.
Funding Centralization: Habryka expresses concern about EA funding being increasingly centralized through a single large foundation (likely referring to Open Philanthropy), arguing this concentration of resources creates unhealthy power dynamics.
Community Culture: He criticizes the shift in EA culture away from what he describes as “a thriving and independent intellectual community, open to ideas and leadership from any internet weirdo” toward something more institutional and conformist.
Failure to Create Change: Habryka states that he no longer sees “a way for arguments, or data, or perspectives explained on this forum to affect change in what actually happens with the extended EA community,” particularly in domains like AI safety research and community governance.
His departure post suggests a deep disillusionment with the direction of the EA community, expressing that while many of the principles of EA remain important, he believes “EA at large is causing large harm for the world” with “no leadership or accountability in-place to fix it.” He recommends others avoid posting on the EA Forum as well, directing them to alternatives like LessWrong.
Here’s another summary. I used Gemini 2.0 Flash (via the API), and this prompt:
The following is a series of comments by Habryka, in which he makes a bunch of criticisms of the effective altruism (EA) movement. Please look at these comments and provide a summary of Habryka’s main criticisms.
Lack of leadership and accountability: He believes EA leadership is causing harm and lacks mechanisms for correcting course.
Emphasis on PR and narrative control: He condemns EA organizations’ risk aversion, guardedness, and attempts to control the narrative around FTX, prioritizing public image over transparency.
Inadequate community health: He laments conformity pressures, fears of reprisal for dissent, and insufficient efforts to cultivate a culture of open disagreement.
Entanglement with FTX: He faults EA leadership, particularly Will MacAskill, for endorsing Sam Bankman-Fried and entangling the movement with FTX despite warnings about SBF’s character.
Hero worship and lack of respect for intellectual leaders: He criticizes the hero worship of MacAskill, contrasting it with MacAskill’s perceived lack of engagement with other intellectual leaders in the community. He sees this as part of a pattern of MacAskill prioritizing popularity and prestige over community health and epistemic integrity.
Misleading communications and lack of transparency: He criticizes CEA for making inaccurate and misleading statements, for omitting crucial context in communications, and for concealing information about funding decisions.
Scaling too quickly and attracting grifters: He worries that EA’s rapid growth and increased funding attract deceptive actors and create perverse incentives.
Overreliance on potentially compromised institutions: He expresses concerns about EA’s deep ties to institutions like Oxford University, which may stifle intellectual exploration and operational capacity.
Ignoring internal warnings about FTX: He reveals that he and others warned EA leadership about Sam Bankman-Fried’s reputation for dishonesty, but those warnings went unheeded. He suggests he personally observed potentially illegal activities by SBF but chose not to share this information more widely.
Flawed due diligence and poor judgment in grantmaking: He feels EA leadership’s due diligence on SBF was inadequate and that they made poor judgments in providing him with substantial resources. He extends this criticism to grantmaking practices more generally.
Unfair distribution of resources: He argues that the current distribution of funds within EA doesn’t adequately compensate those doing object-level work and undervalues their contributions relative to donors. He argues for a system that recognizes the implicit tradeoff many have made in pursuing lower-paying EA-aligned careers.
Centralized media policy and negative experiences with journalists: While supporting a less centralized media policy, he also cautions against interacting with journalists, as they frequently misrepresent interviewees and create negative experiences.
Hmm, I’m not a fan of this Claude summary (though I appreciate your trying). Below, I’ve made a (play)list of Habryka’s greatest hits,[1] ordered by theme,[2][3] which might be another way for readers to get up to speed on his main points:
Leadership
The great OP gridlock
On behalf of the people
You are not blind, Zachary[4]
Reputation[5]
Convoluted compensation
Orgs don’t have beliefs
Not behind closed doors
Funding
Diverse disappointment
Losing our (digital) heart
Blacklisted conclusions
Impact
Many right, some wrong
Inverting the sign
A bad aggregation
‘Greatest hits’ may be a bit misleading. I’m only including comments from the post-FTX era, and then, only comments that fit within the core themes. (This is one example of a great comment I haven’t included.)
although the themes overlap a fair amount
My ordering is quite different to the karma ordering given on the GreaterWrong page Habryka links to. I think mine does a better job of concisely covering Habryka’s beliefs on the key topics. But I’d be happy to take my list down if @Habryka disagrees (just DM me).
For context, Zachary is CEA’s CEO.
FWIW it looks like claude is only summarising this quick take (all the quotes are from it)
The original post is only 700 words, and this is like half that length. Can you not give people the dignity of reading their actual parting words?
Pablo and I were trying to summarise the top page of Habryka’s comments that he linked to (~13k words) not this departure post itself.
Hmm true, I gave it the whole GreaterWrong page of comments, maybe it just didn’t quote from those for some reason.