Does Sam make me want to renounce the actions of the EA community? No. Does your reaction? Absolutely.
Alright, you finally broke me. No honey, all vinegar.
I have spent close to ten years in EA. Over those ten years, I have worked extremely hard and invested in our community. I have organized and helped newcomers and junior members when I’m exhausted, when I’m facing family crises, when the last thing I want to do is get on the phone or train. Instead of going to the private sector and making money, I have stuck around doing impactful jobs that I largely hate as a way of dedicating myself to helping others and this community to thrive. I don’t regret this. Do FTX or Sam Bankman-Fried’s actions change my assessment of my actions? No. Do the reactions I have seen here? Yes, I’m afraid so.
The things that draw me to the EA community are, above anything else, its commitment to supporting one another and *working together as a team* to reduce suffering. Throughout the years, I have held friends as they cried because a project they were doing failed, celebrated their successes, and watched again and again as EAs do the brave thing and are then there for one another. Through this, I have built friendships and relationships that are the joys of my life.
Now the new catechism. Do I condemn fraud? Yes. Of course, I do. This is a stupid question EAs keep performatively asking and answering. Everyone opposes fraud; there is no one on the other side of this issue. Sam’s actions were awful and I condemn them. Do I believe we should circle our wagons and defend Sam? No. However, there is a huge difference between condemning his actions while rallying together to support one another through this awful time and what I see happening here, which I believe can best be described as a witch-hunt against everything and everyone that ever intersected with Sam or his beliefs.
Over the last few days, posters on this forum, Twitter, and Facebook have used this scandal to attack and air every single grievance they have ever had against Effective Altruism or associated individuals and organizations. Even imaginary ones. Especially imaginary ones. This has included Will MacAskill and other thought leaders for the grave sin of not magically predicting that someone whose every external action suggested he wanted to work with us to make the world a better place would YOLO it and go Bernie Madoff. The hunt has included members of Sam’s family for the grave sin of being related to him. It has included attributing the cause of Sam’s actions to everything from issues with diversity and inclusivity, lack of transparency in EAG admissions, and the pitfalls of caring if we all get eviscerated by a nuke or rogue AI, to, of course, our office spaces and dating habits.
Like, stop. There are lessons to be learned here, and I would have been fully down for learning them and working together to fix them with *all of you.* But why exactly should I help those in the community who believe that the moral thing to do when someone is on their knees is to jump on them while yelling “I should have been admitted to EAG 2016!”? Why should I expose myself further by doing ambitious things (no, I don’t mean fraud; that’s not an ambitious thing, that’s a criminal thing) when, if I fail, people are going to make everything worse by screaming “I told you so” to signal that they never would have been such a newb? Yeah. No. The circle I’m drawing around who is and is not in my community is getting dramatically redrawn. This is not because one person or company made a series of very bad decisions; it’s because so many of your actions are those of people I will not invest in further and who I don’t want anywhere near my life or life’s work. I’ll keep the Julia Wises and Cate Halls (their kindness has blown me away), and I’ll keep the people who are working together to fix this. The rest of you, yeah, no thanks.*
If this is a witch hunt, then based on who I’m seeing burnt and who I’m seeing in the mob, to quote Tay-Tay: go ahead and LIGHT. ME. UP.
*(My friends say this is a minority of people in EA. I will reserve judgment until I stop wanting to strip naked and burn my EA t-shirt in the town square while swigging Soylent, amped up on Adderall, and live-streaming it so I can keep up with my polycule. Lizka—I’m sorry—I’ll see myself out.)
A contingent of EAs (e.g., Oliver Habryka and the early Alameda exodus) seems to have had strongly negative views of SBF well in advance of the FTX fraud coming to light. So I think it’s worthwhile for some EAs to do a postmortem on why some people were super worried and others were (apparently) not worried at all.
Otherwise, I agree with you that folks have seemed to overreact more than underreact, and that there have been a lot of rushed overconfident claims, and a lot of hindsight-bias-y claims.
The way to avoid hindsight bias here, I think, is to ask: what else do other, similar contingents hold similarly negative views about?
Will we be flooded with 100 targets that people have negative views about? If so, maybe this kind of signal is usually a false positive.
Will there only be 2 others? If so, maybe we should deal with those 2.
I have maybe 1-2 other people/organizations that I feel as doomy about as I did for SBF.
That said, there are definitely lots of other parts of EA that I feel unhappy about, and I do think there are some deeper things going wrong (like generally too much private scheming and empire building, and not enough blurting out of opinions and open curiosity and deep commitment to honesty).
Your contention: Too much planning and building, not enough blurting out “deeply honest” (and very negative!) opinions.
My contention: Stop burning each other online, EA is not Mean Girls. Shut up, play nice, go into the real world, build, and get shit done.
This actually matters!
Yep, definitely matters! I also think the current situation is some more evidence of my contention over your contention, though not like hugely so (at least without digging into the details).
At least in this case though we can agree that something terrible happened, so let’s start with an analysis of what things could have maybe prevented that. And maybe we will learn that it was really hard to prevent, or the prevention would have been too costly.
And I do agree and want to apologize at least a bit for pushing towards my contention in a way that I think was a bit too much driven by a specific inside view in which this did seem quite straightforwardly caused by a bunch of stuff that I have been concerned about for a while, but I don’t think I’ve explicated those models in sufficient detail to be compelling to many people.
Ollie has a very high false positive rate. I can’t imagine a person or project he wouldn’t get off on kicking the legs out from under (https://getyarn.io/yarn-clip/1767181b-4b9d-4f7f-95fa-be4cae266511). If I join in on this “re-drawing lines around the community” exercise he can stand with Kerry Vaugn, Emile Torres, and Robin Hanson on the other side of my big chalk circle.
No one in the Alameda exodus expected fraud. They just thought Sam was terrible on other dimensions; they’ve said as much. Heck, Lantern lost money they were holding on FTX because they trusted it there.
This feels kind of strawmanny.
I said many times neither me nor others predicted the scale of the fraud and explosion at FTX.
I do think it was clear that Lantern was disassociating themselves from Sam, and stopped giving Sam resources, and that is the primary thing I think we should have done too, based on roughly the same info that Lantern and other past Alameda employees had.
About false-positives: I agree that false-positives are a key thing to pay attention to. I do think I have concerns about a bunch of things in EA here, though I think it’s really far from everyone, and also, I don’t think I have a historically bad track record (like, I did say that CEA was really corrupt and broken during 2015-2017 and I think that is accurate and I think current CEA would agree with that. I also think my concerns about Leverage are well-warranted. I was also one of the people most involved with kicking Diego out of the community, and in the Bay Area warned a lot of people about Brent earlier than others. I also think I was too pessimistic about CEA after 2018, and I was wrong about the future of EA Funds when I was frustrated with it at various points.)
We can make a concrete list of organizations and public intellectuals if you want, and then people can judge on their own if I have a huge false-positive rate.
(For some random examples: I think the 80k podcast is great. I think SSC/ACX is great. I think Open Phil’s research team does a lot of good work despite me deeply disagreeing with them a lot. I think MIRI has done good work but also produced a depression machine that made everyone there depressed. I think FHI was really great before it scaled a lot, now it is a sad husk of its former self, as the university has been smothering it. I think Nicole Ross’s team is great and they do valuable work, though I think they lack ambition. I think Julia Wise’s stuff is high-variance and sometimes has bad consequences, but I’ve come around over the years to thinking that it’s probably good for the world, though I am still hesitant. I think Redwood Research is genuinely well-intentioned and doing good work, and I trust Buck a lot, though almost all the work they do doesn’t seem to be on the critical path towards safe AI (but it’s still my favorite prosaic alignment place to send people to). I think Paul fucked up hard by endorsing OpenAI as much as he did, and I think something kind of bad is happening with his writing style, but he is a genuinely great thinker for AI Alignment and I’ve learned a lot from him about how to think about AI Alignment, and want him to have the resources to pursue his research even if it seems doomed to me. I think CFAR makes some great workshops, though also had some pretty fucked-up dynamics and Anna continues to have a not-great track record at identifying who will end up kind of crazy and causing a bunch of harm. I think Robin Hanson’s research and writing is great, though I heard he has some more dubious in-person behavior. I think the Atlas Fellowship seems pretty cool and I support it, though I think the presentation is too… I don’t know, fake-ambitious/EAish. In-general the Stanford EAs seem like they are doing cool stuff with a bunch of the events and workshops they are running.)
I’d be interested to hear what you think is going wrong with Paul’s writing style, if you want to share.
Guys, please, have a go at people on another person’s post. God knows there are enough of them… This is exactly what I’m talking about, and I will literally have a coronary. Lol.
Appreciate you ❤️
Oof. What did Gideon Futerman do? Break the wagon circle to air his loss of confidence in EA leadership? I think it’s pretty reasonable to have said confidence shaken, especially when no one is talking (seemingly putting their own reputations, and PR, ahead of what’s best for the future of the EA community).
Honestly, I’m not sure why I seem to have become NeoMohist’s enemy! I just posted some questions; sure, they were questioning and not the most sympathetic to the leadership, but it’s hardly enough, I think, to warrant this. On the other hand, I am sure NeoMohist is going through a difficult time like many of us, so I sort of get jumping to attack me (I’m sure I have been similarly unreasonable at times in the past).
In case some readers need more context: The source of the “Some EAs thought Sam was bad” comments seem to originate with Tara Mac Aulay and some of her colleagues in 2018. Some of this group then started Lantern.
What are the false positives? Some of the big things I recall Ollie critiquing in the past are Leverage Research and “mainstream EA playing fast and loose with honesty”; those critiques seem to have aged well.
The claim isn’t “lots of people predicted fraud, therefore we should do a postmortem on why EA’s FTX boosters didn’t predict it”. Rather, the claim is that SBF had a bunch of red flags that plausibly would have sufficed for at least engaging with SBF with a lot more caution, as opposed to whole-heartedly embracing him in the way that a lot of EA did.
It might also have increased the probability of more of FTX’s shady and shoddy business practices coming to light, but I agree that this is more uncertain. The main question is just whether there were process or norm failures in terms of how we reacted to lesser warning signs. Lesser warning signs won’t always let you catch disasters in advance, but it does matter how we react to those signs—in expectation, whether or not it would have helped in this case.
As a moderator, I think a previous version of this comment was rude and clearly violated many Forum norms. Another comment also violates Forum norms.
While I appreciate the edit, this is a warning: if you leave another comment like these, you will probably receive a ban.
Habryka’s comments have been pretty strongly upvoted on the forum these days, indicating that folks have found them helpful. It might be useful for you to provide receipts or anything that supports claims like “Ollie has the highest false positive rate”, “Ollie is terrible for EA”, otherwise this is just a fairly baseless accusation.
See the heart-related moderation request above. Berglund (a.k.a. my new fav) has an excellent, and I am told extra spicy, new shortform on this.
Seriously mate, I get you’re upset, but maybe take a break from the forum for a while? Like, this isn’t healthy. I get that my critiques of the leadership have upset you, and maybe they are incorrect, but I don’t quite understand why you want me out of EA. Also, if you do want to find out who I am before wanting to remove me from your community, please check out my profile so you can know who I am 🙂
A bunch of things that all seem true to me:
Some number of people in the EA community could have done things that were positive in expectation and that would have mitigated much of the downside to EA from FTX.
A bunch of people are overreacting to this situation and making it seem much more damning to EA than I think it is. Some of those people are acting in bad faith.
It is very possible that as a community we overreact to this situation and adopt bad norms, institutions, or practices that are negative EV going forward.
Absolutely. We obviously can weather losing funding. EA started small and it can grow back. And people always have enjoyed heaping one form of abuse on it or another. The more fundamental damage will be what we inflict on ourselves.
But I’m still optimistic this will mostly blow over with respect to the EA movement. Mostly, I think that people are being louder than usual across the board, but they seem to be expressing opinions they’d already held. When it stops being as salient, people will probably more or less quiet down and keep pursuing the same types of goals and having the same perspectives they had previously. Hopefully in the context of improved movement governance.
I appreciate you for writing this! I don’t agree with everything you’ve written here, but I vibe a lot with the spirit. Hindsight bias and “my personal pet peeve against EA is responsible for everything bad that happened here” seem everywhere. My guess is that while there are important updates to be made, and many short term fires to put out, the actually important updates will be far easier to make in a few weeks once things feel calmer and less emotionally raw.
Thank you very much for your decade of hard work for EA, and for sharing some thoughts vividly and articulately that many of us have probably been feeling over the last few days.
I have seen a lot of ‘blood in the water’ reactions, where people with long-standing grievances about various alleged problems in EA (some real, some exaggerated, some imagined) have taken the FTX crisis as an opportunity for criticizing those problems in highly emotional and often unconstructive ways. This often seems driven not by a desire to strategize about how those problems can be rectified in the future, but to enjoy seeing certain people get their ‘comeuppance’ for some past misjudgment or misbehavior that is attributed to them, or to re-engineer EA’s culture in some new ideological or political direction.
This is standard human behavior. Whenever an organization or movement faces a sudden, massive PR crisis, people tend to start backbiting, recriminating, criticizing, re-opening old wounds, and, all too often, fracturing into warring cliques.
I hope that as people who have devoted our lives to taking a big step back from ‘normal’ human emotional reactivity, tribalism, and signaling, and to trying to hold our values and actions to a higher, more rational, more empathetic standard, we can all calm down, treat each other better, remember our common mission, and have confidence that EA will survive, adapt, grow, prosper, and do more of the good that we’re devoted to doing.
I was considering writing something similar, but investing time in writing a post is hard, so thank you for writing this.
While I understand the emotions we all feel, I’m under the impression that the effective altruism community is now reacting in exactly the way it tries to educate the public NOT to: using anecdotal examples and emotional states to guide decision-making. While it’s very human to react in such a way, I’m growing more and more anxious about this direction as I observe the discourse.
Of course what happened is devastating, and there is a lot of constructive feedback to give and good questions to ask, but I hope we can preserve our commitment to compassion and truth-seeking even in hard moments like this. What worries me can be seen in how demanding people are of the EA leaders, to the point of almost blaming them for this without enough evidence or sympathy. Even not-that-good proposals earn a lot of support, because I feel we are all incentivized to be critical of EA (and it feels good to distance ourselves from the misery that FTX caused).
I’m also worried about EA doing things now (both the community here and established orgs) to influence optics rather than for the sake of integrity. That would be worrying if true, because it may lead to recreating potential errors, like people silencing themselves instead of having good-faith arguments. I hope that accountability will prevail and both the community and individuals will be open about where we screwed up, with consequences if needed, instead of us protecting the brand of EA at any cost.
But I also want to mention that I’m incredibly impressed by some people here and generally very happy to consider myself part of this community. I admire the courage, integrity, and sobriety of thinking of many people here. After spending way more time on the EA Forum recently than I should, I came to the conclusion that I would especially like to mention Habryka for his behavior and comments during this period. It’s really a privilege to have such people in the EA community (and I’m really sorry for not mentioning others behaving in a similar way whom I didn’t notice).
I think the point I have been trying to make in criticising the leadership is something like this:
EA has this slightly weird, very opaque, sort of unaccountable leadership which we trust with a lot of power. We treat them, in many ways, like “philosopher-kings” of the community.
We do this on the premise that these people are uniquely good at leading the community and uniquely good at seeing risks that might be posed to the EA enterprise, and if they were more accountable and public, this would reduce their ability to do this, so reduce the ability of EA to do good in the world
However, the FTX scandal has shown us that these EA leaders are just humans, with the same flaws and biases as the rest of us (as I think many of us suspected), and aren’t these superhuman philosopher kings
Thus, we probably shouldn’t centralise so much power and trust in such a small unaccountable opaque group of people.
It’s not that these people aren’t worthy of compassion, or that I would have done better (I wouldn’t have). It’s that we defer so much power to them on the assumption that they are so much better than us at this, and I just think they’re not.
I’m placing myself on a ban because I’m not my best self right now but briefly:
1. “EA has this slightly weird, very opaque and untransparent, sort of unaccountable leadership which we trust with a lot of power. We treat them, in many ways, like “philosopher-kings” of the community.”
I don’t think very many people treat him like this (certainly not the majority; literally no one I know defers to him in this way, and this strikes me as a super weird interpretation of his role in EA). But even treating this as true, how is it Will’s fault? At no point do I personally remember Will standing on a podium of What We Owe The Future, screaming “I am your leader, bow to me, peasants!” I think I would have remembered that. Sounds like a hoot. Instead, the, and so help me I don’t know why I have to emphasize this, *living, breathing human being with feelings* is a philosopher and an academic who had an idea and promoted it because he thought it would make the world a better place. This resulted in a community and organizations inspired by his ideas, not governed by him. It’s a handful of organizations with distinct leaders and a handful of individuals with their own interpretations of his and other people’s work. Maybe people who deferred to Will should be the ones pondering their own epistemics.
2. On Lantern Ventures:
There is a big difference between not wanting to work with someone because you don’t like their ethics (looking at you, Kerry) and thinking they are going to commit the century’s worst fraud. Also, she’s getting death threats. Do you not reckon she’s suffering enough without her community adding their voices to the mob of people saying she should have known or that she could have prevented this? I remember when this stuff happened; it was on the periphery of EA at best. I literally don’t think I thought about it for longer than around 10 minutes, and I had friends personally involved at the time. My take was “Wow! Sam sounds like a bad manager,” not “Uh oh, best nail down the furniture.”
3. On Dancer, and a handful of other people being kind while being critical:
Thank you. You can stay. ;)
I don’t want to get into debates around object-level criticisms this early but I keep being puzzled by this assertion:
“This resulted in a community and organizations inspired by his ideas, not governed by him. It’s a handful of organizations with distinct leaders and a handful of individuals with their own interpretations of his and other people’s work.”
There was also a similar quote elsewhere:
“But Will is not the CEO of EA! He’s a philosopher who writes books about EA and has received a bunch of funding to do PR stuff.”
I don’t think this conception of “people loosely connected together in various ways” really captures the correct level of accountability here. There is a legal entity named Effective Ventures, which is the umbrella organisation of CEA, 80,000 Hours, GWWC, etc., and Will is the president of Effective Ventures as well as CEA. The people in the community volunteer their time and credibility by referring these organisations (and their literature) to their social circles. Many also donate money to these organisations.
I refuse to have a verdict on FTX related criticisms until the dust settles, and most of the non-FTX related criticisms seem unreasonable to me, but this argument of “no one is the leader of EA really” strikes me as quite odd.
I suspect CEA might even be the official owner of the “Effective Altruism” trademark, as I don’t see any organisation that has “Effective Altruism” in its name despite not being approved by CEA. Please inform me if I’m wrong. EA is much more centralised than “Socialism” or “Feminism”. Correction: No one owns “Effective Altruism” as a trademark. More detailed information here.
I like your comment, thank you. I am really confused by the opposite view. I don’t quite understand where Will as governor of EA comes from. There are influential organizations and thinkers in EA, for sure, but ownership and ultimate responsibility, particularly cosmic responsibility for bad actors, feels very different… I think something important is obviously going on here, and once things are less awful we should try to dissect it calmly, kindly, and deliberately.
I don’t think anyone asking for more information about what people knew believes that central actors knew anything about fraud. If that is what you think, then maybe therein lies the rub. It is more that strong signs of bad ethics are important signals, and your example of Kerry is perhaps a good one. Imagine if people had concerns about someone on the level of what they had about Kerry (plausible in the case of SBF; that is what is important to find out) and, despite that, promoted him to be one of the few faces of the movement. That would be problematic, and it is important to figure out what happened. That’s not a witch hunt.
Also, it is very surprising to me that you don’t hold the belief that many people treat e.g. Will as a central figure to defer to. Given what you’ve written, it sounds like you are quite central, so maybe you’re exposed to different people who have more eye-to-eye contact with Will, but it is certainly my experience that most people confer star power on Will. You are right that that doesn’t make it Will’s fault. I am just trying to say that your claim that people don’t treat him like a philosopher-king doesn’t match my experience at all.
To be clear, this was always known, just never acted upon.
Well, yes, that I do agree with
Having read Plato, I have no idea what you mean by this.
Who are these EA leaders? What are their names and what do they do? Who directs what? How sure are you that you aren’t just imagining that there must be some adults in charge somewhere?
EA is anarchy. No one is even a little in charge. Some people get asked for advice more than others. Some people have ideas that are more influential than others. No one knows what they are doing, but all the constructive EAs are trying to do things while also asking around to shore up their epistemics as they go. EA is a workshop where everyone has their own projects but is free to ask the old hands for tips, or to spy on the cool new up-and-coming projects and try to emulate them.
EA forum, Twitter, EAG talks, etc are not real life. Will is a mascot. At best an “influencer.” If you want to shape the future (of the world, EA, the British government) you have to ground yourself in what is actually going on, not just the chattering in places like the EA forum. Be constructive, be nice, support others, and start building some stuff of your own.
I don’t think that’s true. I’ve worked at CEA myself, and I know that CEA wields considerable influence.
I also think your way of discussing is inappropriate.
The monarchy wields influence. That is not the same thing as control.
I will go back and edit this. I’ve clearly crossed the line by more than I realized. Like a lot of people, I’m fraying with this one. It feels like all of the worst folks in EA publicly (and permanently!) shaming all of the best. Break ups don’t hurt this much.
Not going to lie: I understand you’re emotional, but being referred to as one of “all of the worst folks” in EA, together with your previous comment before you edited it (which was exceptionally rude), is honestly one of the worst interactions I’ve had in this community.
I have volunteered hours of my time running seminar programmes, writing curricula, and helping organise speaker events and Q&As. I have spent literally hundreds of hours consuming EA content and trying to compile EA content for people to use. I have oriented my research around X-Risk, spent an entire summer at CERI, taken a year off from university to research X-Risk, spoken at EAGx Rotterdam, and tried to make sense of the world to reduce X-Risk. Sure, I’m not perfect, and critiques of my approach and my criticisms are, I am sure, valid in places. I understand this is emotional and hard for you, and I apologise if my critiques have made it harder; I still think it’s time to have such difficult conversations, but I do apologise if this is hard. To be called “one of the worst folks in EA” and accused, essentially, of not actually doing stuff is really upsetting when I really am trying my best, so I do hope you will apologise.
I don’t usually like responding to these sorts of comments as it is rarely worth it, but:
I truly hope your usual method of arguing is not this ad hominem, with emotive language used rather than rational argument.
Edit: I understand emotions are running high, and I see your above comment. Personal attacks, particularly towards specific individuals in the way above, really aren’t appropriate though, which is why I felt I had to say something here.
Keep in mind that the people who are commenting, writing call-out posts, etc., online are both a) only a small fraction of EAs, and b) a particularly skewed sample in the current moment.
Some of us either chose to or were asked to be quiet for now, or are still reeling from our losses and not sure what to say.
Personally, I’ve mostly seen people confused and trying to demonstrate a willingness to re-evaluate what might have led to these bad outcomes. They may sway too far in one direction, but this only just happened and they are reassessing their worldviews in real time. Some are just asking questions about how decisions were made in the past, so that we have more information and can improve things going forward (which might mean doing nothing differently in some instances). My impression is that a lot of the criticism of EA leadership is overblown and most (if not all) were blindsided.
That said, I haven’t really had the impression it’s as bad and widespread as this post makes it seem. Maybe I just haven’t read the same posts, comments, and tweets.
I do think that working together so we can land on our feet and continue to help those in need sounds nice, and I hope you’ll still be there, since critical posts like this are obviously needed.
Thanks for all your hard work in EA!
I think you (and lots of other EAs who feel the same way you do) are totally correct that you don’t deserve the response you’ve been seeing to the FTX situation. You deserve a huge pat on the back for doing so much for the world.
Separately, I also agree with these paragraphs Oliver wrote a few days ago, and I’m (tentatively) glad that there’s been more criticism than usual on the forum right now (even if it’s ultimately unrelated to FTX):
I know it may feel like someone “curb stomping you while you’re on your knees”. But in many cases I think a better model is (a) people doing public soul-searching or (b) people who were previously self-censoring no longer doing so.
I want a world where people feel appreciated for hard work, and (simultaneously) people feel comfortable in attempting constructive criticism, safe in the knowledge that their criticism won’t have the sort of negative career repercussions your post implies. Here’s my attempt to reconcile those two objectives; this goes out to all the EAs who are feeling burnt out right now:
It sounds like you’ve been working really hard at a job you hate, doing a lot of good. Maybe sometimes you think about taking a vacation or searching for a job that’s more fun, but avoid it because of opportunity costs.
If you’ve been doing that, I want to push back. You deserve a vacation. A nice long sabbatical, even. Not only do you deserve it, it seems justified on consequentialist grounds—I think the opportunity cost of your vacation will be less than the cost of the ingroup-hardening process you describe in your post. (Convenient that consequentialism sometimes calls for vacations, isn’t it?)
In conclusion, please know that your work is appreciated, and please take care of yourself!
I am sorry you’re so upset. As someone who has not dedicated themselves to EA like you have, maybe I can’t understand your anger.
However, I take exception to your use of the term curb stomp. I get that your whole post is deliberately hyperbolic, but curb stomping is a horrifically violent act perpetrated by neo-nazis to cause permanent disfigurement to their victims. https://en.m.wikipedia.org/wiki/Curb_stomp
It doesn’t accurately describe what has happened, even metaphorically, and it is certainly not amusing. I think a lot of the comments (mainly on Twitter) were uncharitable and unkind when Will put out his statement, but I object to that term and found it inappropriate and distasteful.
I love the rest of this post but I agree with you here.
Thank you. Didn’t know. Will change it to “jump on them.” Sorry friend.
This. I know that dilution of our original ideals is the price we pay for growth so perhaps I shouldn’t have expected much better by this point, but it’s still so sad to watch the community jump at the chance to attack some of the people who put so much into starting and building this movement. I think the level of hindsight bias, mob mentality, and lack of empathy is pretty shocking.
Thank you for your ten years of hard work and for swimming against the tide today.
And +100 to the shout-out to Julia Wise and Cate Hall.
This paragraph really resonated with me. I suspect many people whom we would greatly benefit from having in our community are turned off because they got the same feeling you articulated here.
I’m finding it difficult to articulate why I think this is, but let me attempt to:
When I’ve been at my least productive, I find myself falling into a terrible zero-sum mindset of actively searching for things that are unjust or unfair, and resenting how others spend their time and resources.
On the other hand, when I’m at my most productive and fully immersed in projects that matter to me, I never find myself thinking those thoughts. I’m too focused on actually getting things done and producing surplus to care about how others spend their time and resources.
In this mindset I’m incredibly optimistic, and I intuitively feel that any problem is solvable if I put my mind to it. In the former mindset, everything seems doomed to fail and I want to sneer at anyone who thinks otherwise.
These mindsets feel very distinct, and it’s very clear the latter is highly conducive to success and the former is actively harmful. If somebody with the latter mindset gets their first impression of EA from people with the former, I don’t blame them for bailing.
I didn’t notice this comment and I think it’s excellent. Thank you so much for sharing.
I have not seen any of this on the EA Forum[1]. I imagine Twitter is a lot more toxic (as usual) though.
I get that you are angry. I’m sorry, and I wish you well. I’m angry from the other side—I’m seeing too much wagon circling, and not enough willingness to engage or change from the people at the centre. I’ve also invested a decade into EA, and I am upset with how things are going post-FTX Crisis.
This literally made me laugh out loud (you’re a good writer!) :)
Anyway, I think I need to get off the EA Forum for a while. I’m going for a weekend away and will tell my partner to stop me if she sees me on the Forum!
(Apart from maybe dating habits—but the point there isn’t the dating habits, it’s the conflict of interest they cause if they also involve professional relationships!)
I disagree with your assessment of the reactions in EA (I can recognize some symptoms that you mention, but it doesn’t seem to be the majority of people), but I thank you for helping the world, and I hope your disgust will diminish a bit and you will want to continue to participate in EA.
One view is that SBF’s actions are a sample of one and therefore cannot justifiably be the premise for the kind of wide-ranging criticisms fielded against effective altruism on this forum and elsewhere over the last week. I think this is wrong-headed, however. The point is not that SBF’s fraud in and of itself proves the existence of these problems; it merely highlights problems that can be independently justified. For example, it highlights that this is a fairly hierarchical movement that vests a lot of power in the hands of a small number of people, with few mechanisms to hold them accountable and transparent to the community at large.
Across a wide range of domains, crises are productive moments where the legitimacy of the existing order of things breaks down and there is space for new, open, and critical thought. If you are happy with the current incarnation of the movement, no doubt that is regrettable, but clearly many people are not. Obviously some fraction of the criticism—this is the internet, after all—is not thoughtful, constructive, or made in good faith. But to say that criticism at a time of crisis is equivalent to seeing someone on their knees and curb stomping them is, I think, an extremely dangerous attitude. What you are effectively advocating for is a rally-around-the-flag effect.
I agree that some people are being too harsh on the ‘leaders’ of effective altruism, and especially Will. He was one of a great many people who were deceived. I don’t think the problem is one of his or anyone else’s individual character. I think the problem is how the movement is organised. I would add that how it is organised is probably not entirely independent of utilitarian and rationalist philosophy.
Hi, it’s bad to hear that you feel this way, and I can understand why you have this sort of sentiment. A lot of emotions are running high right now.
But here is what I have not seen mentioned here:
My funding has not been affected by this (apart from reducing potential future funding sources), so I am saying this as someone far less affected than most. But a lot of people’s funding came partly or entirely from FTX, and that funding is now uncertain. As far as I am aware, there has been no guarantee that all promised grants will be fulfilled, even through another funding source like Open Phil, which has mentioned raising its criteria. As a direct result of this, there will be people who do not know whether their work can continue, whether they will lose their job, business, or other project, or whether money they have already spent, having received it in good faith while trying to have the highest positive impact they could, may even be clawed back. Some of these people will be students, or people early in their careers, who are less likely to have savings to fall back on, or who may have their savings wiped out in the event of a clawback (I have no idea of the actual likelihood of this, but I have seen it discussed on the forum). Some have perhaps not been given a clear answer by other people in the movement whom they may have expected one from, but who are instead remaining silent (although there might have been private communication that I am unaware of). My sympathy lies with them the most, and I do not blame them (or others) for questioning people who perhaps could have known more (not saying anyone did or didn’t; I wouldn’t know beyond what is in the media).
And that’s of course just the people affected within the EA community. That’s not to mention the hundreds of thousands or millions of customers who were directly stolen from, many of whom lost significant amounts of money or even their life savings. Nor the people who will be affected in the future fallout.
“This has included Will MacAskill and other thought leaders for the grave sin of not magically predicting that someone whose every external action suggested that he wanted to work with us to make the world a better place, would YOLO it and go Bernie Madoff. The hunt has included members of Sam’s family for the grave sin of being related to him.”
I think it is fairly natural to question whether certain people knew more than most, when externally it seems like Will MacAskill may have been some form of mentor to SBF for nearly a decade (I know nothing here, I’m just going by the media on this one, and it seems likely that this may have been overstated in the media because it makes a good story). Family members could plausibly have known more than most, and of course nobody should be threatened, but luckily I have not really seen that within the EA community.
“It has included attributing the cause of Sam’s actions to everything from issues with diversity and inclusivity, lack of transparency in EAG admissions, the pitfalls of caring if we all get eviscerated by a nuke or rogue AI, and, of course, our office spaces and dating habits.”
Dating habits can very much be a conflict of interest (Google it), particularly if they are likely to influence something like the willingness to provide *multi-billion-dollar fraudulent loans* to a person you are (allegedly) dating.
“But why exactly should I help those in the community who believe that the moral thing to do when someone is on their knees is to curb stomp them while yelling “I should have been admitted to EAG 2016!”?”
Who has actually done this? Source? And the main ‘curb-stomped’ people, if you can call it that, are literally 1) someone who, it seems more likely than not, committed multi-billion-dollar fraud (along with some of his inner circle), and 2) someone who at this point is likely the main public face of the movement, whom said fraudster claimed (in many past interviews) motivated him (and public figures should not be above question). Innocent until proven guilty, of course, but even if the odd person takes things too far, that is not indicative of a movement-wide ‘witch hunt’.
A lot of people’s emotions are running high right now; remember that when reading other people’s comments, the same way they should remember it when reading what you write, including this post.
(And in fairness to Twitter, it has been more balanced than I was expecting (considering my base rate for expected Twitter discourse is basically people screaming/a witch hunt; you don’t go to Twitter for reasoned debate). People appear to be defending EA, including members of the public.)
Reposting my penultimate paragraph as it is important and in case people don’t otherwise read that far:
A lot of people’s emotions are running high right now; remember that when reading other people’s comments, the same way they should remember it when reading what you write, including this post.
I think it is appropriate for the movement to reflect at this time on whether there are systematic problems or failings within the community that might have contributed to this problem. I have publicly argued that there are, and though I might be wrong about that, I do think it’s entirely reasonable to explore these issues. I don’t think it’s reasonable to just continually assert that it was all down to a handful of bad actors and refuse to discuss the possibility of any deeper or broader problems. I like to think that the EA community can learn and grow from this experience.
I’ve been thinking about a conversation a friend of mine and I had while reading the book Unmask Alice, about the author Beatrice Sparks and how the books she passed off as authentic diaries contributed to the Satanic panic and the war on drugs. Specifically, we were trying to work out how to avoid making the terrible mistakes multiple grieving parents made and years later ended up deeply regretting, both in connection to Sparks and in cases we’d heard about elsewhere (at least one former advocate for the sex offender registry falls into this category). The place we got to in that conversation was that the kind of overwhelming grief felt in the immediate aftermath of tragedy makes it harder to make measured choices. It is easy to want to stamp out the problem forever, as quickly as possible, because the horrible thing that happened cannot be allowed to happen ever again. But that is not a good mental state to be in when trying to change the world.
I keep thinking about it because I feel like I’m seeing a similar pattern shaking out here. What FTX did was unacceptable, and it cannot be allowed to happen again. We have to find what allowed this and take quick, decisive action, because anything less might let another FTX rise up. I can understand this impulse, and having seen my share of unresolved drama in related communities, I too feel the fear that if we don’t do anything now then nothing will ever be done. But I don’t trust a lot of the solutions being floated right now. I think people have a sense that something must be done, and are reaching for anything that might be something. I just hope that we manage to come out the other side somewhere sensible.
Personally I’d rather be kicked when I’m down. Better to deal with all the pain at once. Maybe there can be a forum feature that can allow a person to tune the percentage of critical posts they see based on sentiment analysis and those who can’t handle all the vitriol at once can take it in little bits at their own convenience.