Noting an unsubstantiated belief about the FTX disaster

There is a narrative about the FTX collapse that I have noticed emerging[1] as a commonly-held belief, despite little concrete evidence for or against it. The belief goes something like this:

  • Sam Bankman-Fried did what he did primarily for the sake of “Effective Altruism,” as he understood it. Even though from a purely utilitarian perspective his actions were negative in expectation, he justified the fraud to himself because it was “for the greater good.” As such, poor messaging on our part[2] may be partially at fault for his downfall.

This take may be more or less plausible, but it is also unsubstantiated. As Astrid Wilde noted on Twitter, there is a distinct possibility that the causality of the situation may have run the other way, with SBF as a conman taking advantage of the EA community’s high-trust environment to boost himself.[3] Alternatively (or additionally), it also seems quite plausible to me that the downfall of FTX had something to do with the social dynamics of the company, much as Enron’s downfall can be traced back to [insert your favorite theory for why Enron collapsed here]. We do not, and to some degree cannot, know what SBF’s internal monologue has been, and if we are to update our actions responsibly in order to avoid future mistakes of this magnitude (which we absolutely should do), we must deal with the facts as they most likely are, not as we would like or fear them to be.

All of this said, I strongly suspect[4] that ten years from now, conventional wisdom will hold the above belief as basically canon, regardless of further evidence in either direction. This is because it presents an intrinsically interesting, almost Hollywood-villain-esque narrative, one that will surely evoke endless “hot takes” which journalists, bloggers, etc. will have a hard time passing over. Expect this to become the default understanding of what happened (from outsiders at least), and prepare accordingly. At the same time, be cautious when updating your internal beliefs so as not to assume automatically that this story must be the truth of the matter. We need to carefully examine where our focus in self-improvement should lie moving forward, and it may not be the case that a revamping of our internal messaging is necessary (though it may very well be in the end; I certainly do not feel qualified to make that final call, only to point out what I recognize from experience as a temptingly powerful story beat which may influence it).

  1. ^

    Primarily on the Effective Altruism forum, but also on Twitter.

  2. ^

    See e.g. “pro fanaticism” messaging from some community factions, though it should be noted that this has always been a minority position.

  3. ^

    EDIT: Some in the comments have pointed out that since SBF has been involved with EA since pretty much forever, it’s unlikely that he was sociopathically taking advantage of the community, and therefore we should not morally absolve ourselves. To this I have two primary responses: A) This may be the case, but do not mistake this objection as defeating the main point, which is that EA ideology was not necessarily the cause of this aspect of his life. We should definitely be introspective in considering how to prevent this in the future, but we should also not beat ourselves up unnecessarily if doing so would be counterproductive. B) It is unclear how deeply he actually believed in EA ideals, and how much of his public persona has been an act—anecdotes (and memes like this one, which I am unsure how much weight to put on as evidence; probably fairly little) suggest the latter, though as someone who’s never met him personally it’s hard to say.

  4. ^

    With roughly 80% confidence, conditional on 1) no obviously true alternative story coming out about FTX that totally accounts for all their misdeeds somehow, and 2) this post (or one containing the same observation) not becoming widely cited (since feedback loops can get complex and I don’t want to bother accounting for that).

Crossposted to LessWrong (50 points, 52 comments)