Noting an unsubstantiated belief about the FTX disaster
There is a narrative about the FTX collapse that I have noticed emerging[1] as a commonly-held belief, despite little concrete evidence for or against it. The belief goes something like this:
Sam Bankman-Fried did what he did primarily for the sake of “Effective Altruism,” as he understood it. Even though from a purely utilitarian perspective his actions were negative in expectation, he justified the fraud to himself because it was “for the greater good.” As such, poor messaging on our part[2] may be partially at fault for his downfall.
This take may be more or less plausible, but it is also unsubstantiated. As Astrid Wilde noted on Twitter, there is a distinct possibility that the causality of the situation may have run the other way, with SBF as a conman taking advantage of the EA community’s high-trust environment to boost himself.[3] Alternatively (or additionally), it also seems quite plausible to me that the downfall of FTX had something to do with the social dynamics of the company, much as Enron’s downfall can be traced back to [insert your favorite theory for why Enron collapsed here]. We do not, and to some degree cannot, know what SBF’s internal monologue has been, and if we are to update our actions responsibly in order to avoid future mistakes of this magnitude (which we absolutely should do), we must deal with the facts as they most likely are, not as we would like or fear them to be.
All of this said, I strongly suspect[4] that ten years from now, conventional wisdom will hold the above belief as being basically canon, regardless of further evidence in either direction. This is because it presents an intrinsically interesting, almost Hollywood villain-esque narrative, one that will surely evoke endless “hot takes” which journalists, bloggers, etc. will have a hard time passing over. Expect this to become the default understanding of what happened (from outsiders at least), and prepare accordingly. At the same time, be cautious when updating your internal beliefs so as not to assume automatically that this story must be the truth of the matter. We need to carefully examine where our focus in self-improvement should lie moving forward, and it may not be the case that a revamping of our internal messaging is necessary (though it may very well be in the end; I certainly do not feel qualified to make that final call, only to point out what I recognize from experience as a temptingly powerful story beat which may influence it).
- ^
Primarily on the Effective Altruism forum, but also on Twitter.
- ^
See e.g. “pro fanaticism” messaging from some community factions, though it should be noted that this has always been a minority position.
- ^
EDIT: Some in the comments have pointed out that since SBF has been involved with EA since pretty much forever, it’s unlikely that he was sociopathically taking advantage of the community, and therefore we should not morally absolve ourselves. To this I have two primary responses: A) This may be the case, but do not mistake this objection as defeating the main point, which is that EA ideology was not necessarily the cause of this aspect of his life. We should definitely be introspective in considering how to prevent this in the future, but we should also not beat ourselves up unnecessarily if doing so would be counterproductive. B) It is unclear how deeply he actually believed in EA ideals, and how much of his public persona has been an act: anecdotes (and memes like this one, which I am unsure how much weight to put on as evidence; probably fairly little) suggest the latter, though as someone who’s never met him personally it’s hard to say.
- ^
With roughly 80% confidence, conditional on 1) no obviously true alternative story coming out about FTX that totally accounts for all their misdeeds somehow, and 2) this post (or one containing the same observation) not becoming widely cited (since feedback loops can get complex and I don’t want to bother accounting for that).
What makes this implausible for me is that SBF has been involved in EA since very early on (~2013 or earlier?). Back then, there was no money, power, or fame to speak of, so why join this fringe movement?
To be clear, I think there were multiple causal factors, and believing in EA probably explained a lot less variance than SBF’s idiosyncratic character traits (e.g., plausibly dark triad traits, especially narcissism and Machiavellianism which entail immense lust for power and fame, disregard of common-sense morality such as not lying, etc.). I mean, there are like 10,000 EAs and I don’t know of anyone who has committed a serious crime because of EA.
A hypothetical human with the same personality traits as SBF but who doesn’t believe in EA, plausibly would have done pretty shady things as well. Unfortunately, it is possible that EA motivated SBF to amass even more power and money than otherwise. Also, EA provided SBF with a network of highly competent, motivated and trusting people.
I think conceiving of SBF as someone who totally did not believe in EA principles and did everything just for money and power is simplistic and false.
We will obviously never know the precise contents of SBF’s internal monologue. But it is conceivable that he thought he was doing everything only because of EA. Everyone is the hero of their own story.
But you cannot blindly trust your internal monologue. SBF’s actions were probably shaped to a large extent by subconscious motivations. My best guess is that SBF might have been somewhat aware that he had ulterior motives, but that he thought this was okay and that he had it all under control (“no one is perfect”, “I’ll use my desires for power as additional motivational fuel”, etc.). Though it’s also possible that he thought of himself as practically a saint.
In my experience a disproportionate number of EAs have dark triad traits, which is not surprising given that a disproportionate number of EAs are mainly in touch with their heads.
Also, wasn’t SBF’s plan to become a maths professor, and he changed his mind because of earning-to-give arguments?
Copying my LW comment:
I don’t buy this argument for a few reasons:
- SBF met Will MacAskill in 2013, and it was following that discussion that SBF decided to earn to give.
- EA wasn’t a powerful or influential movement back in 2013, but quite a fringe cause.
- SBF was in EA since his college days, long before his career in quantitative finance and later in crypto.
- SBF didn’t latch onto EA after he acquired some measure of power or when EA was a force to be reckoned with, but pretty early on. He was in a sense “homegrown” within EA.
The “SBF was a sociopath using EA to launder his reputation” narrative is just motivated credulity IMO. There is little evidence in favour of it. It’s just something that sounds too good to be true and absolves us of responsibility.
Astrid’s hypothesis is not very credible when you consider that she doesn’t seem to be aware of SBF’s history within EA. Like, what’s the angle here? There’s nothing suggesting SBF planned to enter finance as a college student before MacAskill sold him on earning to give.
This is a fair critique imo; I’m updating against SBF using EA for sociopathic reasons. That being said, I’m only slightly updating towards him using EA ideology as his main motivator to commit fraud, as that still may very well not be the case.
For the record, I did not believe this to be the case (and had extensively argued as much on Twitter). Even a naive utilitarian calculus doesn’t justify risking the funds of over a million customers, the FTX Future Fund, the reputation and public goodwill of the EA community, and the entirety of FTX itself to try and bail out Alameda (if that is indeed what happened).
That said, an EA in crypto I trust has told me that if Alameda went under, FTX would have gone down with it, and so it may have been a case of “lose literally everything” or gamble customer funds to try and save Alameda (and by extension FTX).
If Alameda’s bad bets were going to drag FTX under if SBF let it fail, then it’s possible that the trade was:
- Lose Alameda Research, FTX (and its customer funds), the FTX Future Foundation, and all future donations; or
- Gamble customer funds to try and save all of the above if you win, vs. lose them plus the reputation and public goodwill of the Effective Altruism community if you lose.
Then the utilitarian calculus is very different. I’m not trying to argue that SBF committed fraud due to EA ideology, but it’s no longer as implausible as it seemed to me in the first place.
At least it may not be the case that SBF had the option to just let Alameda go under and keep FTX/its customer funds. It’s not clear that the funds of over a million customers would have been preserved even if FTX did not gamble them.
(The above argument is speculative and based on second hand explanations of crypto dynamics I don’t understand very well; it may be completely wrong.)
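The trade sketched above can be made concrete with a toy expected-value calculation. All numbers below (the 30% win probability, the payoff magnitudes) are invented for illustration, not estimates of FTX's actual position; the point is only that if letting Alameda fail already loses everything, a naive utilitarian calculus can favor the gamble:

```python
# Toy expected-value comparison for the hypothesized "nothing to lose" gamble.
# All numbers are invented for illustration; they are not estimates of
# FTX's actual balance sheet or of SBF's real odds.

def expected_value(p_win: float, value_if_win: float, value_if_lose: float) -> float:
    """Expected value of a binary gamble."""
    return p_win * value_if_win + (1 - p_win) * value_if_lose

# Option A: let Alameda fail. If FTX necessarily goes down with it,
# the outcome is a certain loss (normalize the value retained to 0).
value_let_fail = 0.0

# Option B: gamble customer funds. Win: everything is saved (value 100).
# Lose: everything is gone anyway, plus EA's reputation (value -20).
p_win = 0.3  # assumed probability that the gamble succeeds
value_gamble = expected_value(p_win, 100.0, -20.0)  # 0.3*100 + 0.7*(-20) = 16.0

print(value_gamble > value_let_fail)  # True under these invented numbers
```

Of course, with a lower win probability or a heavier weight on the reputational damage, the same arithmetic flips back against the gamble.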
I don’t see how losing Alameda could have lost the depositor funds, at least if there had been no gambling with depositor funds to that point. I can see, however, how it could crash the FTT token, and that could bring down FTX as a business. But the deposits should have been safe, and the ones not denominated in FTT should have held their value. So I don’t think the “nothing to lose” scenario is likely.
Fwiw, I don’t think a crash in the FTT token would’ve crashed FTX as a business (assuming no funny business with extending loans to other parties collateralized in FTT). Afaik FTT was basically a revenue-share token, essentially like common stock.
Just as Meta shares falling 70% didn’t affect their core business of showing users ads, a crash in FTT shouldn’t have affected the core exchange business. It’s just the going rate for rights to future profits.
(crossposted from LW) -- I think the fact that SBF is a vegan, and also the way he lives, is very strong evidence for the narrative being true. It’s the kind of thing many people will dismiss because it’s in the “personal life choice” category and perhaps sounds judgy, but the proper Bayesian update here is significant.
Also worth noting here is that, as expected, EAs have in general condemned this idea, and SBF has gone against the standard wisdom of EA in doing this. I feel like EA’s principles were broken, not followed, even though I agree SBF was almost certainly a committed effective altruist. The update, for me, is not “EA as an ideology is rotten when taken very seriously” but rather “EAs are, despite our commitments to ethical behaviour, perhaps no more trustworthy with power than anyone else.”
This has caused me to pretty sharply reduce my probability of EA politicians being a good idea, but hasn’t caused a significant update against the core principles of EA.
I wonder if “perhaps no more trustworthy with power than anyone else” goes a little too far. I think the EA community made mistakes that facilitated FTX misbehavior, but that is only one small group of people. Many EAs have substantial power in the world and have continued to be largely trustworthy (and thus less newsworthy!), and I think evidence like our stronger-than-average explicit commitments to use power for good, and the critical reflection happening in the community right now, suggests we are probably doing better than average—even though, as you rightly point out, we’re far from perfect.
Fair point. I think, in a knee-jerk reaction, I adjusted too far here. At the very least, it seems that EAs are at least somewhat more likely to do good with power, if they have that aim, than people who just want power for power’s sake. It’s still an adjustment downwards on my part for the EV of EA politicians, but not to zero compared to the median candidate of the same political party.
In Bayesian terms the update should be in the direction of EAs being less trustworthy than the average person, if you agree that the average CEO of a firm like FTX wouldn’t have done what SBF did.
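In odds form, that update looks like the sketch below. The prior and both likelihoods are invented placeholders; the direction of the update, not the numbers, is the point:

```python
# Toy Bayesian update in odds form. H = "EAs are less trustworthy with
# power than the average person." All numbers are invented placeholders.

prior_p = 0.2                       # assumed prior P(H)
prior_odds = prior_p / (1 - prior_p)

# Evidence E: the CEO of a flagship EA-affiliated firm commits massive fraud.
# If you think E is likelier under H than under not-H (e.g. because the
# average CEO of a firm like FTX wouldn't have done this), the likelihood
# ratio exceeds 1 and the posterior must move towards H.
likelihood_ratio = 0.05 / 0.01      # assumed P(E|H) / P(E|not H) = 5

posterior_odds = prior_odds * likelihood_ratio
posterior_p = posterior_odds / (1 + posterior_odds)

print(posterior_p > prior_p)  # True: the update goes towards "less trustworthy"
```

If instead you think fraud of this scale is roughly as likely under either hypothesis, the likelihood ratio is 1 and the observation carries no update at all, which is where the two commenters disagree.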
I’ll be honest, I’ve been putting judgement based on his (apparent) lifestyle on hold, as I’ve seen some anecdotes/memes floating around twitter suggesting that he may not have been honest about his veganism/other lifestyle choices. I don’t know enough about that situation to distinguish the actual truth of the matter, so it’s possible I’ve been subject to misinformation there (also I scrolled past it quickly on Twitter and it’s possible it was some meta-ironic meme or something). If there is (legitimate) evidence he was actually faking it, that would make me update strongly in the other direction, of course.
If there is solid evidence that he was lying about being vegan, I’ll change my position completely. That’d be a much worse sign than just not being vegan in the first place. (But as you say, the fact that some people on Twitter hinted at it isn’t convincing.)
It’s possibly worth noting that in his conversation with Tyler Cowen he did mention that he had previously lied (briefly) about being vegetarian.
I dug up that conversation, and the point you’re referring to is presumably here. The story he tells is: he’s trying but failing to be a vegetarian, gets asked by another vegetarian in a social setting whether he is one, says he is, and then never eats meat ever again.
I know I’m contradicting what I just said, since it is technically a lie, but honestly this doesn’t seem like a big deal to me. In my experience, you can genuinely lie “without wanting to” in social situations. Someone asks you a question, and some unconscious process in your brain produces an answer within a second, before “you” really get to have a say on it. This has happened to me several times. And I can understand why it happened here, since he was trying to be a vegetarian.
Thanks for giving the details, I couldn’t quite remember the full story and should’ve looked it up and quoted directly. I don’t quite know what to make of him doing this—on the one hand, a small lie about being vegetarian doesn’t seem particularly pernicious or noteworthy, especially given he went vegetarian after lying about it. On the other hand, it does at least strike me as somewhat odd to do this if he had just eaten a cheeseburger a few hours earlier. It does update me ever-so-slightly towards thinking that he’s liable to lie if it makes him look good—it might not just be a ‘lie without intending to’ situation.
+1 to this, can attest I’ve done the same, and immediately regretted it lol
Curious what you think about screenshots like this one, which I’ve now seen in a few different places.
It’s lowered my confidence. Though it could have various mundane explanations, the simplest one being that it was taken before he became a vegan, and if so I feel bad about speculating.
I don’t think I would update at all based just on those memes—particularly as my understanding is that he lived in group houses! (I know a lot of other EAs are vegan, but not everyone is)
It’s not a meme, it’s actually a screenshot from a video. It seems more likely to me, though, that he is vegan and they were just using a fridge that other people use as well, because if he weren’t vegan and this is his fridge, they probably would just have removed the eggs.
I would call the way it has been posted on Twitter a meme, and the main point was on how much to update, not on the format (meme or video) this information was presented in! On which I think we are in agreement.
this seems probable to me, thanks for sharing a good-faith explanation
Yeah, that would be another mundane explanation!
To the extent that any EA beliefs likely contributed to FTX’s collapse, I suspect that they are mostly related to the fact that typical EA risk attitudes, while normally correct, transfer poorly to the financial sector under human cognitive constraints. Specifically, I think that the finance industry is a special case where the recommendation to be more risk-seeking is wrong. This is because in most areas (e.g., media, innovation, charitable impact) the distribution of outcomes is right-tailed, but in finance it is left-tailed. As there is robust evidence that humans overweight the outcomes of recent events when forming their expectations, this will cause someone trying to optimise for impact in a risk-neutral way (as SBF seemingly tried to) to take on excessive risk in finance and not enough in other fields. This could be especially dangerous if SBF had internalised the meme that we should be more risk-seeking due to the distribution of outcomes being right-tailed. If my hypothesis is correct, it implies that it is especially important to know your risk environment when making decisions but there may not be many implications for most EA activities, as FTX operated in an atypical risk environment.
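This mechanism can be illustrated with a toy simulation (all payoff numbers are invented). The two bets below are constructed so that the right-tailed one has positive true expected value and the left-tailed one negative, yet someone who judges each bet by a short window of recent outcomes will usually get both judgments backwards:

```python
import random

random.seed(0)

# Invented payoffs with opposite tail shapes:
#   right-tailed: 1% chance of +300, else -1  -> true EV of about +2.0 per play
#   left-tailed:  1% chance of -300, else +1  -> true EV of about -2.0 per play
def right_tailed() -> float:
    return 300.0 if random.random() < 0.01 else -1.0

def left_tailed() -> float:
    return -300.0 if random.random() < 0.01 else 1.0

def frac_windows_looking_good(bet, window=20, trials=10_000) -> float:
    """Fraction of short experience windows whose total payoff is positive."""
    good = 0
    for _ in range(trials):
        if sum(bet() for _ in range(window)) > 0:
            good += 1
    return good / trials

# Someone overweighting recent outcomes will usually find the negative-EV
# left-tailed bet attractive (roughly 0.99**20 ~ 82% of windows show a
# profit) and the positive-EV right-tailed bet unattractive (only the
# ~18% of windows containing a rare win show a profit).
print(frac_windows_looking_good(left_tailed) > frac_windows_looking_good(right_tailed))
```

Which direction recency bias pushes a decision-maker therefore depends entirely on which tail dominates, matching the comment's claim that finance is the special left-tailed case where "be more risk-seeking" transfers poorly.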
A relevant Twitter thread by Dustin Moskovitz:
I just wanted to note that I agree with everything you’ve said here.
Edit: This is kinda whatever but I guess I’m getting downvoted for agreeing here? There is no agree vote for the OP, so the only way I can differentiate between upvoting because I thought it was high quality vs actually agreeing with the post is by commenting...
I agree with you.
However, the alternative isn’t any better for fostering the EA Community:
Given that SBF is a fraud:
- It makes anybody who took his money either a criminal or a fool. It’s only natural that whichever EA public figure took his money tries to avoid the “fool” label, but… that’s what conmen do: they fool people.
- It makes anybody who gave him money a fool.
This is really damning, because people might begin associating the EA community and the EA causes themselves with fraud.
In a sense, it’s better for the EA causes for people to believe that the SBF fraud was propelled by a misguided fanaticism of SBF himself.
Being fooled and being a fool are two different things with two different meanings. Being misled by someone who is skilled at and highly incentivized to mislead people doesn’t make someone definitionally naive or bad at reasoning.
It certainly isn’t a good outcome for EA either way, and I don’t want us prematurely absolving ourselves of any responsibility we may end up holding. I just want to be as clear-thinking about this as possible, so we can best mend ourselves moving forward.
Nice.