Epistemic status: I may regret writing this, but will likely regret not writing this more.
The FTX blowup might be a bad judgement call, not willful fraud
Loyalty is good, and y’all are way over-updating
Ambition is good, and failed bets are worth celebrating
1. Fraud or bad judgement?
Never attribute to malice that which is adequately explained by stupidity — Hanlon’s razor
Here’s my model of what happened: SBF was busy. Things were confusing. He made a lot of decisions. One of them blew up.
Why do I believe this “bad judgement” thesis? Well, it’s what Sam claims. Publicly on Twitter, his explanation comes out to: “I genuinely believed customers were not leveraged, and that we could pay out all deposits”.
I’m biased towards trusting people at their word; and furthermore, SBF was a personal hero of mine. So you might not extend him the same trust that I do. But: Sam doesn’t have a history of lying on the record. Even at this point in time, when the muckrakers of the world are busy scrutinizing every deed he’s done, I don’t see any allegations of the form “he lied about this thing”. “Truthful, but mistaken” seems like a better model for SBF than “masterful schemer”. I choose to believe this. Maybe I’ll be wrong; feel free to bet on whether I’ll change my mind. (This comment does lead me to some level of doubt.)
Why do I emphasize the difference between fraud (roughly, taking funds from customers with full knowledge) and a bad judgement call? Because “don’t do fraud” is a good heuristic to propagate, but “don’t make mistakes” is not.
It’s really, really easy to pick apart other people’s mistakes, especially after the fact. If you’re a public figure making 10 good decisions and 1 bad one, a critic can jump in with “look at that mistake! It was such a mistake! I would never have made that mistake!” Oftentimes, the critic is even correct! But: in the same position, they would have made 2 other bad decisions, ones that weren’t even on their radar. From Zvi Mowshowitz on what would be difficult about being “in charge” during Covid:
…You can do better by taking a market price or model output as a baseline, then taking into account a bias or missing consideration. Thus, you can be much worse at creating the right answer from scratch, and yet do better than the consensus answer.
Think of this as trying someone’s fifteen-step chocolate chip cookie recipe, then noticing that it would be better with more chocolate chips. You might have better cookies, but you should not then claim that you are the superior baker.
If you, right now, are sitting on a high horse, saying “it’s so obvious; just don’t take customer deposits and gamble with them”… guess what, you do not understand how complex systems break. There is no special ability to know which of your many decisions might be the one that causes it all to go kaput. I’ve witnessed and responded to my fair share of outages at Google and Manifold, and very often dumb things led to these outages, but you’ll never know which of your actions are the dumb ones. While you were editing a config file that later broke the site, you didn’t sit back and think “hm, maybe this will crash the site, I should look at it carefully”. That change looked no different from the twenty other changes you made this week, all of which were fine and good.
2. On loyalty
Whether this blowup was caused by intentional deception or an honest mistake, the EA community has been extremely quick to change its tune. In less than a week, everyone has gone from “SBF, golden boy” to “What a criminal, don’t be like that guy”. EA leaders have posted denunciations of fraud, and distanced themselves from Sam; the most upvoted all-time EA Forum post is a community condemnation; the entire Future Fund team up and resigned.
To be fair, the rest of the world is dogpiling on him too. Elon Musk called Sam “full of shit”; Sequoia deleted their glowing profile of SBF published one month ago; Miami took the FTX name off their arena.
But from EA folks, this behavior strikes me as cowardly, coldhearted, opportunistic, bandwagon-y, two-faced and distasteful. I am extremely confused, because these EA leaders are some of the smartest and “good-est” people in the world, whose work I respect and admire, who have shaped the way I think. So it’s very possible that I’m just in the wrong here, but…
Whatever happened to loyalty? To supporting those who have helped you in the past? Are EA folks only fair-weather friends, happy to accept your money in good times but also ready to eviscerate you to maintain deniability and a glossy PR sheen? What distinguishes altruism from selfishness is “being good to people who cannot help you back”. It is extremely suspicious that EA as a whole was happy to heap praise on SBF while the money was flowing, and then turn their backs as soon as it was clear the gravy train had dried up.
Having a scout mindset is good; updating your beliefs about SBF in light of new evidence is good; but there is such a thing as updating too far. Sure, the community should call out the bad, but have we up and forgotten about every good thing that SBF has accomplished, especially for EA? Every point in the deleted Sequoia article is still true.
Is a committed vegan
Earned to give while at Jane Street
Led CEA for a few months
Sent money to Ukrainians in time of need
Incubated hundreds of millions worth of good longtermist causes through Future Fund and related spending on eg Anthropic
Take a step back: what are we assessing here? If the question is “should I associate with a person who has this track record, and also once fraudulently misused customer money, but has repented and is trying as hard as possible to fix it”… the answer seems like a clear yes to me. And as a prosaic consideration, I continue to believe that Sam and the rest of the FTX leadership team are extremely talented and aligned people. Even with tarnished reputations today, I expect them to accomplish good and great things. To jump immediately to cutting ties seems like a large strategic error.
And on a personal note, I aspire to create a lot of value for the world, and direct it towards doing lots of good. Call me overconfident, but I expect to be a billionaire someday. The way EA treats SBF here sets a precedent: if the EA community is happy to accept money when the going is good, but then is ready to cut ties once the money dries up… you can guess how excited I would be to contribute in the first place.
3. On ambition
Imagine a world in which things had gone a little differently. In World 2, CZ never triggered a bank run on FTX because he got locked out of his Twitter account. Alameda repays its debts and continues on to print money. In 2025, FTX is stable and worth hundreds of billions as the world’s largest online brokerage — and then the news breaks that three years ago, Sam willfully took a risky gamble using customer funds to keep FTX and Alameda both afloat. What would your reaction be? Would you denounce fraud, demand that customers be compensated (how much?), ask Sam to step down?
Fred Smith, the founder of FedEx, famously gambled his company’s entire bank account at a casino in order to keep deliveries going:
I asked Fred where the funds had come from, and he responded, ‘The meeting with the General Dynamics board was a bust and I knew we needed money for Monday, so I took a plane to Las Vegas and won $27,000.’ I said, ‘You mean you took our last $5,000-- how could you do that?’ He shrugged his shoulders and said, ‘What difference does it make? Without the funds for the fuel companies, we couldn’t have flown anyway.’ Fred’s luck held again. It was not much, but it came at a critical time and kept us in business for another week.
Of course, these two stories aren’t exactly the same; betting investor/company money is different from betting money entrusted to you for other purposes. But I can’t help but think that if SBF’s plan had worked, and he were still EA’s rich uncle financing our ventures, we would be applauding him for bravado and wisdom in making that call. It feels like EA is punishing SBF not for being unethical, but for being unlucky.
(Crucially: I think that it is correct to consistently support him in our world and World 2. You can also be consistent by saying that EA should denounce him in both worlds. But if you believe the latter — tell me, how much did you know about crypto, exchanges, or trading firms before last week?)
Risk-taking and ambition are two sides of the same coin. If you rush to denounce risks that failed, you do not understand what it takes to succeed. My very subjective sense of people in the EA community is that we are much more likely to fail due to insufficient ambition than to too much risk-taking, especially without the support and skillset of the FTX team.
Disclaimers: Manifold received a $1m investment and $500k grant through the FTX Future Fund. Our team spent a couple weeks in the Bahamas as part of the EA Bahamas Fellowship program, including meeting SBF in person at a party in his penthouse. We may have exchanged a couple dozen words; I do not know him personally. All opinions here are my own.
Responses to this situation I endorse:
In favour of compassion, and against bandwagons of outrage by Emrik
This is a little weird, but I do feel like I ought to disclose a bias here, which is that I like Sam Bankman-Fried. I have done a few podcast interviews and events with him, and I have always found him likable, smart, thoughtful, well-intentioned and candid. That is not in any sense investing advice or whatever; it’s just how I feel. I am rooting for this all to work out for him and FTX.
My emotional conflict of interest here is that I’m really f#%king devastated. I never met or communicated with SBF, but I was friendly with another FTX/Alameda higher-up around 2018, before they moved abroad. At the time they seemed like a remarkably kind, decent, and thoughtful person, and I liked them a lot. I desperately want to believe they didn’t know about the fraud, but it seems really implausible. If they did, then I genuinely have no idea what happened, and I hope the investigation finds some reasonable explanation, like that they were doing so many stimulants and psychedelics that the DMT entities were piloting their body like an anime mech. I probably shouldn’t exactly say “I hope they’re okay” when there are so many victims who deserve okayness more. But I hope there’s some other world-branch where they never got involved in any of this and they’re living their best life and doing lots of good, and I hope the version of me in that world branch is giving them the support and reassurance that I can’t give them here.
More generally, I trusted and looked up to the FTX/Alameda people. I didn’t actually keep money in FTX, but I would have if there had been any reason to; I didn’t actually tell other people they should trust FTX, but I would have if those other people had asked. Lower your opinion of me accordingly.
Suggested reading: Oshi no Ko chapters 24-26.
Thanks to Sinclair, Rachel, Jack and Lynelle, along with many others, for discussions on this topic.
This article from The Wall Street Journal suggests that what happened was more like “taking funds from customers with full knowledge” than like a mistake:
(See also this article by The New York Times, which describes the same video meeting.)
There are other signs of fraud. For example:
Reuters reports that FTX had a “backdoor” which “allowed Bankman-Fried to execute commands that could alter the company’s financial records without alerting other people, including external auditors,” according to their sources.
On November 10, the official FTX account on Twitter announced that FTX was ordered to facilitate Bahamian withdrawals by Bahamian regulators. Days later, the Securities Commission of the Bahamas claimed that that was a lie. As Scott Alexander put it, “this might have been a ruse to let insiders withdraw first without provoking suspicion.”
FTX’s legal and compliance team resigned very early in this. As Matt Levine pointed out in one of his articles about the debacle (https://archive.ph/OER98), this probably means that they were not aware of what was going on.
Prior to this, FTX and Alameda insisted that they were at “arm’s length” and that Alameda did not get preferential treatment at FTX. However, according to The New York Times’s sources, Alameda CEO Caroline Ellison “had been sitting within view of computers displaying [FTX]’s trading data,” despite Alameda being supposed to operate in a different office. Moreover, Mr. Bankman-Fried was also involved in Alameda, according to the NYT, “contributing to the decision-making on big trades.”
I’m leaning toward this explanation too. But these SBF tweets are probably meant to suggest something like the following:
Alameda used the standard spot margin lending that anyone can use to borrow the $10b from users who explicitly opted into this feature.
Alameda’s collateral dropped in value and people withdrew from FTX.
The liquidation engine couldn’t close the enormous position in time, and the backstop liquidity providers couldn’t handle it either.
FTX, for the first time, had to do a clawback – something that they try hard to avoid but reserve the right to do.
All in all that seems unlikely to me. There’s this talk of a backdoor to hide transactions from accounting and auditors, and the spot margin approach would’ve generated interest payments to countless users, which would’ve been hard to hide. (But maybe that’s telephone game, and the backdoor is simply the invite-only backstop liquidity provider program or whatever. Seems unlikely though.)
But, at least without some complex backdoor, such an enormous increase in demand for lending would’ve increased the interest rates, and the total size that is lent out is also public.
Here’s an aggregation of the total size in USD and the size-weighted average of the interest rates across some 16 major coins over time. (Pulled from my private copy of the data.)
The interest rate is a bit spiky, but nothing major, and the total size hovers around $4–5 billion. An additional $10 billion would be obvious as an enormous, roughly 3x step up. Plus, when I lent out USD, I was typically immediately matched with borrowers, indicating that lending is the bottleneck for USD stablecoins. So it might not even have been possible to borrow $10 billion. That doesn’t hold for BTC and ETH though, where there were (I think) more lenders than borrowers.
I’ve cut off the last few days when people started withdrawing, because the interest went up a lot. (I would’ve had to use a log scale to keep the usual fluctuations in the interest rates visible.)
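The “size-weighted average of the interest rates” described above is just a weighted mean over the lending pools. As a minimal sketch (the pool sizes and rates below are invented for illustration, not real FTX data):

```python
# Size-weighted average lending rate across coins.
# All figures below are made up for illustration.

def size_weighted_rate(positions):
    """positions: list of (size_usd, annual_rate) tuples."""
    total = sum(size for size, _ in positions)
    if total == 0:
        return 0.0
    return sum(size * rate for size, rate in positions) / total

lending = [
    (2_000_000_000, 0.05),  # hypothetical USD pool: $2b lent at 5% APY
    (1_500_000_000, 0.01),  # hypothetical BTC pool
    (1_000_000_000, 0.02),  # hypothetical ETH pool
]
print(round(size_weighted_rate(lending), 4))  # 0.03
```

Note how the largest pools dominate the average, which is why a sudden ~$10b of extra borrowing against a ~$4–5b base would show up as an unmistakable jump in both the total size and the blended rate.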
The first half of this post reads like “I trust he didn’t know because he said so and I choose to trust him.” That’s not convincing at all. Since you talk about incentives of “disloyalty” on future actions, I want to also flag that being naive / too trusting encourages more fraud.
In the second half of the post, you assume it was indeed fraud but you say maybe it’s okay because we’re only in this situation now because he got caught. Which one is it? That makes me think “it sounds like you were going to defend him no matter what came out.”
I like the FedEx story. I think it’s quite different from the FTX situation, though. The stakes were so much lower (even if we adjust for inflation) and getting caught was an acceptable risk. By contrast, this FTX situation is so far beyond anything that is redeemable when you get caught that you should never have enough confidence in practice to even attempt it, even if we’re for the moment assuming act utilitarianism.
You ask us to envision a world where Sam amassed 300 billion and it’s all legit and secured because crypto (or other investments) hit a golden run. Okay, but what about the world where it looks like he amassed 300 billion, but then it all implodes and he’s on the hook for 100 billion instead of 10? It seems like, based on what we know about how he operated, that world is more likely than the successful one.
Yeah, perhaps I could have been more clear in my argumentation structure. Point 1 is a consideration on the object-level: was it willful? But points 2 and 3 assume that even if it was willful, the community response goes too far in condemnation, and condemnation without regard for loyalty/ambition might hurt its ability to actually do good in the world.
I think we should definitely not be loyal to people who commit massive fraud, or praise ambitious destruction! I know “stand fully by my people no matter how right or wrong they are” is a common moral stance but I think it’s enormously wrong and destructive. It’s an important virtue to support things that are good and not things that are bad, even if we’re very attached to them. (Also, like, I think SBF betrayed “us” first.)
(Sorry this is more of a skeleton of an argument than an actual argument, I keep meaning to write out more of my thinking here and not finding time)
or like, if you’re close with someone who did a significant bad thing and is now facing significant consequences for it, it can make sense to be loyal in the sense of—trying to help them make it through this time, trying to not make things worse for them. but not in the sense of denying or defending their wrongdoing.
Strongly disagree with most of this but upvoting because I am actively in favor of people laying out their reasoning for unpopular positions so that people can engage with the reasoning directly, rather than operating only on the level of “this is wrong and you should feel bad”.
I disagree with this policy, for what it’s worth. I think you should upvote high quality posts, where the author uses good reasoning in favour of a conclusion you disagreed with or hadn’t considered, which you think deserve more attention—not just every post that makes an unpopular argument!
Yeah I wouldn’t upvote every post that makes an unpopular argument. But I upvote posts that I want to encourage on the margin, which includes posts that are wrong but better to make than not to.
Thank you for posting this. I haven’t yet read through the whole thing yet, and I don’t necessarily agree with it, but I think it’s important that people feel comfortable expressing their opinions here. The fact that within minutes of posting this has gotten −8 votes is something I find concerning, as I doubt those people have even had time to read and process what you said before voting and I suspect they’re voting based on anger and groupthink. I hope the community will be able to have a productive conversation in these comments.
Yeah, reading further, I definitely don’t agree with a lot of these claims. But the fact that I feel like I have to post this clarification in order to avoid getting downvoted myself is something I think needs to be talked about. The original post is now down to −15, and I haven’t even finished reading it.
I see it’s now up to +18, which is promising. Implies that people who vote without fully reading the post are more likely to downvote than upvote.
Risk-taking and ambition are two sides of the same coin when the parties who stand to bear the downside are the ones who benefit from the upside, and can consent. Appropriating user funds to bear the downside risk without their knowledge is not ambition; it is theft, and not morally acceptable. If, for example, Alameda had collapsed due to trading losses, but customer deposits on FTX were untouched, then it would be an entirely different matter.
This seems like an odd post to me. Your headline argument is that you think SBF made an honest mistake, rather than wilfully misusing his users’ funds, and most commenters seem to be reacting to that claim. The claim seems likely wrong to me, but if you honestly believe it then I’m glad you’re sharing it and that it’s getting discussed.
But in your third point (and maybe your second?) you seem to be defending the idea that even if SBF wilfully misused funds, then that’s still ok. It was a bad bet, but we should celebrate people who take risky, but positive EV, gambles, even if they strongly violate ethical norms. Is that a fair summary of what you believe, or am I misreading/misunderstanding? If it is, I think this post is very bad and it seems very worrying that it’s currently got +ve karma.
Specifically, he points out that while “don’t do fraud” is a good heuristic to have, “don’t make mistakes” is not a good heuristic at all, and this is trivially true.
We can’t expect people to be perfect and never make mistakes, so why are you disagreeing with this?
I think your latter points are better supported than your first point and I hope people keep reading long enough to get there. Additionally, I am upvoting this post because I agree with the lessons you are trying to share regardless of whether we should apply the lessons to the community’s reaction to SBF specifically (I think we probably should, but like, probably not do a 180 degree pivot to be all fluffy rainbows about it and act like it was okay).
Anyway, I’m just happy to see someone talking about loyalty, mistakes, other worlds, compassion, and stuff. These are important issues. And even if they were not that important, and should be just a small weighting in how we decide how to react to SBF, they still should be factored in, yet so far no one has brought them up.
And yeah it sucks that me saying I upvoted this post means I might get downvoted. Hold your trigger fingers plz :(
I just wanted to mention that this comment tripped my “bravery debate” detector. I still upvoted it because honestly the bravery debate framing seems correct here, and I said something similar in my own comments earlier. But then again, everyone who engages in bravery debates thinks their framing is accurate. So let’s be careful not to give posts additional weight just because they’re speaking against majority EA opinion.
No, I’m just used to, as a woman, buttering most comments up (irl and online) in unnatural ways to not be seen as a bitch or low-intelligence or a clueless outsider. Right now I’m tired so maybe I over-corrected here, but living life in that way does cause anxiety, so that’s also a genuine anxious tone you’re catching. I read the other comments and they are getting upvotes when they clarify that they don’t really agree with the post or like it. I think I agree with and like the post more than the other commenters and have been considering writing similar.
It sucks that there is pretty much always someone ready to thumbs you down no matter how you word things, and it feels reasonable to spare a few words in the cases where it is most likely. (And this comment has no anxious tone because I see that is frowned upon here, even though I’m now actually more anxious about this comment than the first one.)
Very reasonable! I understand you feel like you have to walk a fine line in order to not trigger social disapproval of your words; I think that’s bad, and to be clear, I did not mean to make it seem like I disapproved of your comment. I wish EA could be a place where everyone felt comfortable speaking naturally without having to add a bunch of disclaimers.
Thank you—I think you did a good job of capturing what I was trying to say. We shouldn’t go full fluffy rainbows, and we should directionally update against SBF compared to before FTX imploded; but what I’m seeing is way overcorrected, and I’m trying to express why.
Thanks for having the courage to write this. Regardless of whether it’s correct, it is good to have the position represented and it is much easier in the current environment to take the other side on this.
I applaud that you wrote how you feel against social incentives.
It seems to me that the main way for our community to avoid allowing future devastating mistakes like with SBF/FTX is to have more posts like this and norms that encourage dissenting opinions and go against hype (anti-hype?).
Especially if it’s true that people had heard rumors about some problems, or had some reason to act on pieces of information regarding SBF’s character, but silenced themselves. Socially punishing these kinds of posts seems like recreating the environment for such moral and truth-seeking failure.
On a relevant note, it’s a bit problematic that main posts don’t have disagree-voting, though, because maybe people vote on whether they agree and don’t necessarily want to punish you for expressing your feelings.
This is a weird paragraph. If your goal were doing the most good, why would it matter how you expect EA to treat you in the case of failure? It kinda sounds like your goal is social status among the EA community.
This isn’t to say that you don’t have a good point. If people are donating to EA because they want social status, that’s still money going towards good causes, and perhaps we should reward them for that in order to encourage more people to do so. But I’d have a hard time calling that “altruistic behavior” on their part.
Because he’s a human being and human beings need social support to thrive. I think it’s false to equate this perfectly fine human need with a lower motive like status-seeking. If we want people to try hard to do good we as a community should still be there for them when they fall.
I don’t think it’s either/or. I think it’s consistent for Austin’s philanthropy to be primarily motivated by altruism and for him to also feel scared of the prospect of his community turning on him when he makes a mistake, perhaps to the point of putting him off the whole idea completely. And I’d expect most EAs to have a similar mix of motivations.
Yeah, idk, it’s actually less of a personal note than a comment on decision theory among future and current billionaires. I guess the “personal” side is where I can confidently say “this set of actions feels very distasteful to me” because I get to make claims about my own sense of taste; and I’m trying to extrapolate that to other people who might become meaningful in the future.
Or maybe: This is a specific rant to the “EA community” separate from “EA principles”. I hold my association with the “EA community” quite loosely; I only actually met people in this space like this year as a result of Manifold, whereas I’ve been donating/reading EA for 6ish years. The EA principles broadly make sense to me either way; and I guess I’m trying to figure out whether the EA community is composed of people I’m happy to associate with.
I have seen a growing trend among EAs of disregarding the law and due process. I have also been told not to be a legal alarmist when I have raised concerns about how a program, event, or grant is administered. I will not get into specific examples, but I think a culture where the thought process is “law and procedure should not come in the way of doing good” or “let’s weigh the risk of not following versus the possible good” creates a situation where decisions are made with brashness. I think the situation with SBF has less to do with thoroughly thought-out fraud and more to do with brash decisions being made.
We have created demigods in the EA world, which set the tone and tenor of how things are done. This event with FTX is a good reminder for us as a community to reset.
What is “due process” here in your view?
The classic minimum formulation in a legal sense (Mullane) is notice and a reasonable opportunity to respond. SBF resigned on the 10th and I am sure the new CEO has kept him far away from any work. So he has had several days to amplify his initial explanation.
I think you mean “waiting for more evidence to come in”, which I would agree with if I found his explanation plausible. But the explanation isn’t that much more plausible to me than “because the moon is made of blue cheese.”
I’m glad you wrote this. I’m with the vast majority of people in thinking it extremely unlikely that Sam just made an honest mistake, but given that you think he did, I’m glad you wrote this.
I think this comment re urgency is very relevant here. I think that when you suddenly have a lot of people briefly looking at you who know nothing about you beyond your association with something negative, it’s reasonable to just make very basic points very clearly while being very mindful that many people will try their hardest to take whatever you say out of context to prove that their initial assessment of you was correct.
Why? Hanlon’s razor is an overused and rarely argued for article of faith in the EA / rationalist sphere, which sounds wise, but could easily be wrong.
Moreover, it’s best used as a heuristic, and should be defeasible by moderate evidence.
Mistake theory is not inherently superior to conflict theory; it’s an empirical claim.
Maybe Hanlon’s razor is a piece of ‘reversible advice’, which is useful only if you are already inclined to see malice everywhere. But if you wield it, you’ll misattribute some actions to stupidity when they were actually malicious, because malicious actions can often be explained by incompetence. In fact, they’re designed to look that way, because people don’t like providing incontrovertible evidence that they are untrustworthy. So you’ll be taken advantage of.
The more powerful someone is, the more dangerous it is to mistake their malice for incompetence.
Sam’s claims are a very small piece of evidence, since I’d expect to see him claim innocence and defend himself regardless of whether he is malicious or incompetent.
I find these points credulous, and they seem to be the foundation on which you are arguing that FTX was innocent of malice.
I think that unless we update, EA’s culture of trust—that it doesn’t even occur to us to imagine a traitor in our midst—will lead us to pal up with criminal ultra-rich people again and again. (This might not be the first time—with 50% confidence I think the criminality of former EA darling Ben Delo was malicious rather than a mistake.)
I think Sam is a good, wonderful, thoughtful, kind person who lost his way. His motivation was and always has been EA. He got enamored by the game (goal, constraints, rewards) and hyper-focused on winning and, yes, as he says, “screwed up”. Let’s give him some grace. He’s in a tough spot & needs us. #SupportSam
I’m personally still reserving judgment until the dust settles. I think in this situation, given the animosity towards SBF from customers, investors, etc, there are clear incentives to speak out loud if you believe there was fraud, and to stay quiet if you believe it was an honest (even if terrible) mistake. So we’re likely seeing biased evidence.
Still, a mistake of this magnitude seems at the very least grossly negligent. You can’t preserve both the integrity and the competence of SBF after this. And I agree that it’s hard to know whether you’re competent enough to do something until you do it and succeed or fail. But then the lesson to learn is something like “be in constant vigilance, seek feedback from the people who know most about what you are trying to do”, etc.
Also, loyalty is only as good as your group is. You can’t use a loyalty argument to defend a member of your group when they are suspected of malfeasance. You might appeal to the loyalty of those who knew them best and didn’t spot any signs of bad behavior before, but that’s only a handful of people.
I’m not sure that the mistake vs. fraud angle is very meaningful beyond the question of whether SBF was wrong for taking stupid risks or wrong for committing fraud. This is an important distinction, yes, but in terms of how EA should be reassessing its own frameworks and norms, I think a number of important issues are the same in both cases.
I feel like this scenario should also be cause for a significant reassessment of the people involved and of community norms, oversight, etc., because there’s a point where making mistakes at such a huge scale, in this case more or less unilaterally, should be made as difficult as possible, not supported by a “take risks and see what sticks” mentality. There’s a very important difference between crashing a website or printing flyers with a typo and losing investor money. This is a mistake that caused serious harm to people to whom FTX had a legal and moral duty, a duty that was not fulfilled, even if it was a mistake. Even if all things were perfect and the mistake occurred as the result of an act-of-god type fluke, there would still be good reasons to call into question SBF’s capability and judgement as a steward of funds, particularly in the absence of signs that there were meaningful efforts to mitigate, of which, to the best of my knowledge, there are none. Yes, mistakes can be hard to foresee, and yes, they are a part of doing business, but I don’t feel that mistakes of such a huge scale should just be treated as learning opportunities, since a lack of serious consequences disincentivizes taking the necessary care and effort to avoid future mistakes.
If anything, I think this should be a sign that EA and related organizations should probably be keeping their donors, particularly their megadonors, at arm’s length instead of treating them as golden-boy poster children for the movement, allowing them significant influence, and integrating them so deeply into the movement. To me, this is a matter of good governance more so than PR, but it does cut both ways. Allowing megadonors to buy influence, and becoming so deeply enmeshed that the fortunes of the donors and the movement are intertwined (both literally and metaphorically), doesn’t just allow for reputation washing; it allows the donor to buy intellectual and ideological control. It also, as evidenced by this post, creates a dynamic where there can be an expectation of “loyalty” to donors despite what appears to be blatant fraud, because of their previous donations and involvement. No one should be able to buy influence over, or the loyalty of, a movement that is supposed to be empirically guided.
Again, even in the absence of fraud, I would argue that norms should not encourage taking significant risks with significant potential downsides, particularly when you’re taking risks with other people’s money without a clear go-ahead to do so. The world where SBF didn’t make a mistake/commit fraud and lose 15 billion dollars, but instead played it safe to make and donate 5 billion, is the better option—for EAs, for the creditors of FTX, and for SBF himself. Arguably, the world where SBF had just continued to ETG at Jane Street may even be better than the current one, depending on the fallout of the current situation. Cultivating a culture where risk-taking and ambition are encouraged regardless of scale is, imo, bad.
Great post and glad to see contrarian takes. (That’s true as a general matter but I also happen to agree with this one :P)
Couple quick thoughts:
1. Loyalty is important not just as a personal virtue but for efforts at collective action, because it convinces people to engage in long-term altruistic thinking. Hahrie Han has done important empirical work showing that people are motivated to help when they have a sense of a shared past and a shared future. Sudden ruptures in social relationships are very destructive to fostering that culture.
2. Good decision-making generally does not involve dramatic changes to our beliefs. This is one of the less-known aspects of Tetlock’s research on superforecasters: they very rarely make large updates, instead making small updates based on continuous data, and they don’t overcorrect when the mob moves. It seems likely to me that this is a case where we can try to put Tetlock’s research into effect. I don’t see the sort of dramatic evidence I would need to change my mind about SBF (though I also probably did not view him as highly as many others did; my prior was that Sam was a very smart guy who was well-intentioned but going in the wrong direction in life).
3. Good decision-making requires avoiding the Fundamental Attribution Error. I imagine most people on this forum are aware of the FAE. But I blogged about systemic forces in the collapse of FTX that seem more important to me than any of Sam’s personal misdeeds.
Thanks for posting this against the social incentives right now.
My initial reaction to the situation was similar to yours—wanting to trust SBF and believe that it was an honest mistake.
But there are two reasons I disagree with that position.
First, we may never know for sure whether it was an honest mistake or intentional fraud—many who commit fraud can claim they were making honest mistakes. EA should mostly not support people who cannot demonstrate that they have not committed fraud.
Second, when you are a custodian of that much wealth and bear that much responsibility, it’s not ok to have insufficient safeguards against mistakes. It’s immoral to fail in your duty of care when the stakes are this high.
I think your latter points are better supported than your first point, and I hope people keep reading long enough to get there. Additionally, I am upvoting this post because I agree with the lessons you are trying to share, regardless of whether they fit SBF specifically (I think they probably do, but I’m actually too tired to say right now, and am just happy to see someone talking about loyalty and related virtues).