Thanks. In addition to lots of general information about FTX, this helps answer some of my questions about it: it seems likely that FTX/Alameda were never massively profitable except for large bets on unsellable assets (does anyone have better information on this?); even though they had large revenues, much of that may have been spent dubiously by SBF. And the various actions needed to maintain a web of lies indicate that Caroline Ellison and Nishad Singh, and very likely Gary Wang and Sam Trabucco (who dropped off the face of the earth around the time of the bankruptcy [1]), were complicit in fraud severe and obvious enough that any moral person (possibly even a hardcore utilitarian, if it is true that FTX was consistently losing money) should have quit or leaked evidence of it.
Four or five people is very different from a single bad actor, and this almost confirms for me that FTX belongs on the list of ways EA and rationalist organizations can go insane in harmful ways, alongside Leverage, the Zizians, and possibly others. That said, it is not clear that FTX experienced a specifically EA failure mode rather than the very common one in which power corrupts.
I was confused by this until I read more carefully. This link’s hypothesis is about people just trying to fit in―but SBF seemed not to try to fit in with his peer group! He engaged in a series of reckless and fraudulent behaviors that none of his peers seemed to want. From Going Infinite:
He had not been able to let Modelbot rip the way he’d liked—because just about every other human being inside Alameda Research was doing whatever they could to stop him. “It was entirely within the realm of possibility that we could lose all our money in an hour,” said one. One hundred seventy million dollars that might otherwise go to effective altruism could simply go poof. [...]
Tara argued heatedly with Sam until he caved and agreed to what she thought was a reasonable compromise: he could turn on Modelbot so long as he and at least one other person were present to watch it, but should turn it off if it started losing money. “I said, ‘Okay, I’m going home to go to sleep,’ and as soon as I left, Sam turned it on and fell asleep,” recalled Tara. From that moment the entire management team gave up on ever trusting Sam.
Example from Matt Levine:
There is an anecdote (which has been reported before) from the early days of Alameda Research, the crypto trading firm that Bankman-Fried started before his crypto exchange FTX, the firm whose trades with FTX customer money ultimately brought down the whole thing. At some point Alameda lost track of $4 million of investor money, and the rest of the management team was like “huh we should tell our investors that we lost their money,” and Bankman-Fried was like “nah it’s fine, we’ll probably find it again, let’s just tell them it’s still here.” The rest of the management team was horrified and quit in a huff, loudly telling the investors that Bankman-Fried was dishonest and reckless.
It sounds like SBF drove away everyone who couldn’t stand his methods until only people who tolerated him were left. That’s a pretty different way of making an organization go insane.
This shouldn’t be an EA failure mode when the EA community is working well: word should have gotten around about SBF’s shadiness and recklessness, leading to some kind of investigation before FTX reached the point of collapse. The first person I heard making the case against SBF post-collapse was an EA (Rob Wiblin?), but we were way too slow. Of course, it has been pointed out that many people who worked with or invested in FTX were fooled as well, so what I wonder about is: why weren’t there any EA whistleblowers on the inside? Edit: was it that only four people plus SBF knew about FTX’s worst behaviors, and the chance of any given person blowing the whistle in a situation like that is under 25% or so? But certainly more people than that knew he was shady. Edit 2: I just saw important details on who knew what. P.S. I will never get used to the EA/Rat tendency to downvote earnest comments without leaving comments of their own...
You know what, I was reading Zvi’s musings on Going Infinite...

Q: But it’s still illegal to mislead a bank about the purpose of a bank account.
Michael Lewis: But nobody would have cared about it.
He seems to not understand that this does not make it not a federal crime? That ‘we probably would not have otherwise gotten caught on this one’ is not a valid answer?
Similarly, Lewis clearly thinks ‘the money was still there and eventually people got paid back’ should be some sort of defense for fraud. It isn’t, and it shouldn’t be.
...
Nor was Sam a liar, in Lewis’s eyes. Michael Lewis continued to claim, on the Judging Sam podcast, that he could trust Sam completely. That Sam would never lie to him. True, Lewis said, Sam would not volunteer information and he would use exact words. But Sam’s exact words to Lewis, unlike the words he saw Sam constantly spewing to everyone else, could be trusted.
It’s so weird. How can the same person write a book, and yet not have read it?
And it occurred to me that all SBF had to do was find a few people who thought like Michael Lewis, and people like that don’t seem rare. I mean, don’t like 30% of Americans think that the election was stolen from Trump, or that the cases against Trump are a witch hunt, because Trump says so and my friends all agree he’s a good guy (and they seek out pep talks to support such thoughts)? Generally the EA community isn’t tricked this easily, but SBF was smarter than Trump and he only needed to find a handful of people willing to look the other way while trusting in his Brilliance and Goodness. And since he was smart (and overconfident) and did want to do good things, he needed no grand scheme to deceive people about that. He just needed people like Lewis who lacked a gag reflex at all the bad things he was doing.
Before FTX I would’ve simply assumed other EAs had a “moral gag reflex” already. Afterward, I think we need more preaching about that (and more “punchy” ways to hammer home the importance of things like virtues, rules, reputation and conscientiousness, even or especially in utilitarianism/consequentialism). Such preaching might not have affected SBF himself (since he cut so many corners in his thinking and listening), but someone in his orbit might have needed to hear it.
This link’s hypothesis is about people just trying to fit in―but SBF seemed not to try to fit in to his peer group! He engaged in a series of reckless and fraudulent behaviors that none of his peers seemed to want.
(Author of the post) My model is that Sam had some initial tendencies for reckless behavior and bullet-biting, and those were then greatly exacerbated via evaporative cooling dynamics at FTX.
It sounds like SBF drove away everyone who couldn’t stand his methods until only people who tolerated him were left. That’s a pretty different way of making an organization go insane.
Relatedly, this kind of evaporative cooling is exactly the dynamic I was trying to point to in my post. Quotes:
People who don’t want to live up to the demanding standard leave, which causes evaporative cooling and this raises the standards for the people who remain. Frequently this also causes the group to lose critical mass.
[...]
My current best model of what happened at an individual psychological level was many people being attracted to FTX/Alameda because of the potential resources, then many rounds of evaporative cooling as anyone who was not extremely hardcore according to the group standard was kicked out, with there being a constant sense of insecurity for everyone involved that came from the frequent purges of people who seemed to not be on board with the group standard.
Sorry if I sounded redundant. I’d always thought of “evaporative cooling of group beliefs” like this: “we start with a group with similar values/goals/beliefs; the least extreme members gradually disengage and leave; this cascades into a more extreme average, which leads to others leaving”―very analogous to evaporation. I might’ve misunderstood, but SBF seemed to break the analogy by consistently being the most extreme member, and by actively and personally pushing others away (if, at times, accidentally). Edit: So… arguably one can still apply the evaporative cooling concept to FTX, but I don’t see it as an explanation of SBF himself.
What do you mean by “(Author of the post)”?
I am the author of the linked post that DPiepgrass was commenting on: https://www.lesswrong.com/posts/HCAyiuZe9wz8tG6EF/my-tentative-best-guess-on-how-eas-and-rationalists
He meant that he wrote the linked post on hypotheses for how EAs and rationalists sometimes go crazy.
I thought that Sam Trabucco was not EA, but rather someone that SBF knew from math camp and MIT.
This seems right, thanks. I don’t think we have positive evidence that Trabucco was not EA, though.
Wang pled guilty to serious crimes, including wire fraud, conspiracy to commit securities fraud, and conspiracy to commit commodities fraud.
(can’t link plea PDF on mobile) [Edit: https://fm.cnbc.com/applications/cnbc.com/resources/editorialfiles/2022/12/21/1671676058536-Gary_Wang_Plea_Agreement.pdf ]