On career choice:

Not long before interning at Jane Street, SBF had a meeting with Will MacAskill, a young Oxford-educated philosopher who was then just completing his PhD. Over lunch at the Au Bon Pain outside Harvard Square, MacAskill laid out the principles of effective altruism (EA). The math, MacAskill argued, means that if one’s goal is to optimize one’s life for doing good, often most good can be done by choosing to make the most money possible—in order to give it all away. “Earn to give,” urged MacAskill.
…
It was his fellow [fraternity members] who introduced SBF to EA and then to MacAskill, who was, at that point, still virtually unknown. MacAskill was visiting MIT in search of volunteers willing to sign on to his earn-to-give program.
At a café table in Cambridge, Massachusetts, MacAskill laid out his idea as if it were a business plan: a strategic investment with a return measured in human lives. The opportunity was big, MacAskill argued, because, in the developing world, life was still unconscionably cheap. Just do the math: At $2,000 per life, a million dollars could save 500 people, a billion could save half a million, and, by extension, a trillion could theoretically save half a billion humans from a miserable death.
MacAskill couldn’t have hoped for a better recruit. Not only was SBF raised in the Bay Area as a utilitarian, but he’d already been inspired by Peter Singer to take moral action. During his freshman year, SBF went vegan and organized a campaign against factory farming. As a junior, he was wondering what to do with his life. And MacAskill—Singer’s philosophical heir—had the answer: The best way for him to maximize good in the world would be to maximize his wealth.
SBF listened, nodding, as MacAskill made his pitch. The earn-to-give logic was airtight. It was, SBF realized, applied utilitarianism. Knowing what he had to do, SBF simply said, “Yep. That makes sense.” But, right there, between a bright yellow sunshade and the crumb-strewn red-brick floor, SBF’s purpose in life was set: He was going to get filthy rich, for charity’s sake. All the rest was merely execution risk.
His course established, MacAskill gave SBF one last navigational nudge to set him on his way, suggesting that SBF get an internship at Jane Street that summer.
In 2017, everything was going great for SBF. He was killing it at Jane Street… He was giving away 50 percent of his income to his preferred charities, with the biggest donations going to the Centre for Effective Altruism and 80,000 Hours. Both charities focus on building the earn-to-give idea into a movement. (And both had been founded by Will MacAskill a few years before.) He had good friends, mostly fellow EAs. Some were even colleagues.
… [much further down in the profile]
So when, that next summer, MacAskill sat with SBF in Harvard Square and carefully explained, in the way only an Oxford-educated philosopher can, that the practice of effective altruism boils down to “applied utilitarianism,” Snipe’s arrow hit SBF hard. He’d found his path. He would become a maximization engine. As he wrote in his blog, “If you’ve decided that some of your time—or money—can be better spent on others than on yourself, well, then, why not more of it? Why not all of it?”
On deciding what to do after leaving Jane Street:
SBF made a list of possible options, with some notes about each:
Journalism—low pay, but a massively outsized impact potential.
Running for office—or maybe just being an advisor?
Working for the movement—EA needs people!
Starting a startup—but what, exactly?
Bumming around the Bay Area for a month or so—just to see what happens.
On setting up the initial Japanese Bitcoin arbitrage at Alameda:
Fortunately, SBF had a secret weapon: the EA community. There’s a loose worldwide network of like-minded people who do each other favors and sleep on each other’s couches simply because they all belong to the same tribe. Perhaps the most important of them was a Japanese grad student, who volunteered to do the legwork in Japan. As a Japanese citizen, he was able to open an account with the one (obscure, rural) Japanese bank that was willing, for a fee, to process the transactions that SBF—newly incorporated as Alameda Research—wanted to make.
The spread between Bitcoin in Japan and Bitcoin in the U.S. was “only” 10 percent—but it was a trade Alameda found it could make every day. With SBF’s initial $50,000 compounding at 10 percent each day, the next step was to increase the amount of capital.
At the time, the total daily volume of crypto trading was on the order of a billion dollars. Figuring he wanted to capture 5 percent of that, SBF went looking for a $50 million loan. Again, he reached out to the EA community. Jaan Tallinn, the cofounder of Skype, put up a good chunk of that initial $50 million.
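The profile’s numbers can be sanity-checked with some quick arithmetic. A minimal sketch (my own illustration, not from the profile; in reality the spread narrowed and the trade was capacity-limited, so nothing compounded this cleanly):

```python
# Illustrative arithmetic only -- naive compounding, ignoring fees,
# capacity limits, and the spread decaying over time.
def compound(principal: float, rate_per_period: float, periods: int) -> float:
    """Value of `principal` after `periods` rounds at `rate_per_period` each."""
    return principal * (1 + rate_per_period) ** periods

start = 50_000.0
# $50k at a 10% spread captured once a day, for a month: roughly $870k.
after_30_days = compound(start, 0.10, 30)

# The profile's later figure: 5% of ~$1B in daily crypto trading volume.
target_capital = 0.05 * 1_000_000_000  # $50 million
```

This shows why the next bottleneck was capital rather than returns: at those rates the stake outgrows itself within weeks, and capturing a meaningful share of daily volume requires far more principal than compounding alone can supply quickly.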
On the early days at Alameda:
The first 15 people SBF hired, all from the EA pool, were packed together in a shabby, 600-square-foot walk-up, working around the clock. The kitchen was given over to stand-up desks, the closet was reserved for sleeping, and the entire space overrun with half-eaten take-out containers. It was a royal mess. But it was also the good old days, when Alameda was just kids on a high-stakes, big-money, earn-to-give commando operation. Fifty percent of Alameda’s profits were going to EA-approved charities.
“This thing couldn’t have taken off without EA,” reminisces Singh, running his hand through a shock of thick black hair. He removes his glasses to think. They’re broken: A chopstick has been Scotch taped to one of the frame’s sides, serving as a makeshift temple. “All the employees, all the funding—everything was EA to start with.”
On how he was thinking about future earnings:
“Am I,” [reporter asks], “talking to the world’s first trillionaire?”
...
“Maybe let’s take a step back,” he says, only to launch into an explanation of his own, personal utility curve: “Which is to say, if you plot dollars-donated on the X axis, and Y is how-much-good-I-do-in-the-world, then what does that curve look like? It’s definitely not linear—it does tail off, but I think it tails off pretty slowly.”
His point seems to be that there is, out there somewhere, a diminishing return to charity. There’s a place where even effective altruism ceases to be effective. “But I think that, even at a trillion, there’s still really significant marginal utility to dollars donated.”
...
“So, is five trillion all you could ever use to help the world?”
...
“Okay, at that scale, I think the answer might be yes. Because, if your spending is on the scale of the U.S. government, it might have too weird and distortionary an impact on things.”
… so, money spent now will be more effective at making the world a better place than money spent later. “I think there are some things that are pretty urgent,” SBF says. “There’s just a long series of crucial considerations, and all of them matter a lot—and you can’t fuck any of them up, or you miss most of the total value that you could ever get.”
To be clear, SBF is not talking about maximizing the total value of FTX—he’s talking about maximizing the total value of the universe. And his units are not dollars: In a kind of GDP for the universe, his units are the units of a utilitarian. He’s maximizing utils, units of happiness. And not just for every living soul, but also every soul—human and animal—that will ever live in the future. Maximizing the total happiness of the future—that’s SBF’s ultimate goal. FTX is just a means to that end.
On what differentiates FTX in crypto:
The FTX competitive advantage? Ethical behavior. SBF is a Peter Singer–inspired utilitarian in a sea of Robert Nozick–inspired libertarians. He’s an ethical maximalist in an industry that’s overwhelmingly populated with ethical minimalists. I’m a Nozick man myself, but I know who I’d rather trust my money with: SBF, hands-down. And if he does end up saving the world as a side effect of being my banker, all the better.
On the EA community in the Bahamas that congealed around FTX:
A cocktail party is in full swing, with about a dozen people I don’t recognize standing around. It turns out to be a mixer for the local EA community that’s been drawn to Nassau in the hopes that the FTX Foundation will fund its various altruistic ideas. The point of the party is to provide a friendly forum for the EAs who actually run EA-aligned nonprofits to meet the earn-to-give EAs at FTX who will fund them, and vice versa. The irony is that, while FTX hosts the weekly mixer—providing the venue and the beverages—it’s rare for an actual FTX employee to ever show up and mix. Presumably, they’re working too hard.
...
“Imagine nerds invented a religion or something,” says Woods, stabbing at my question with vigor, “where people get to argue all day.”
“It’s… an ideology,” counters Morrison. The argument has begun.
Woods amiably disagrees: “EA is not an ideology, it’s a question: ‘How do I do the most good?’ And the cool thing about EA, compared to other cause areas, is that you can change your views constantly—and still be part of the movement.”
...
Woods serves up an answer to my question. (Fittingly, she’s wearing tennis whites.) “EA attracts people who really care, but who are also really smart,” she says. “If you are altruistic but not very smart, you just bounce off. And if you’re smart but not very altruistic,” she continues, “you can get nerd sniped!”
...
“This ties into the way FTX is doing its foundation,” Morrison says, helpfully knocking the ball back to my true interest. “The foundation wants to get a lot of money out there in order to try a lot of things quickly. And how can you do that effectively?” It’s a rhetorical question, a move worthy of a preppy debate champ who went to a certain finishing school in Cambridge—which is exactly what Morrison is. “Part of the answer is to give money to someone in the EA community.”
“Because EA is different from other communities,” Woods continues, picking up right where Morrison left off. “They’re like, ‘This is the ethical thing, and this is the truth.’ And we’re like, ‘What is the ethical thing? What is the truth?’”
The excerpts above are from Sequoia Capital’s profile of SBF (published September 2022, now pulled).
Following your analogy, if a fan of Novik had:
been convinced by Novik to dedicate their career to the Novikian ethic
been pointed by Novik to a promising first job in that career path
decided to leave that promising first job on the basis of Novikian reasoning, framing the question of what to do next in Novikian terms
worked with a global network of Novikians to implement an international crypto arbitrage
received seed funding from a prominent Novikian to scale up this arbitrage
exclusively hired Novikians to continue scaling the arbitrage once it started working
thought about forward-facing professional decisions strictly in terms of the Novikian ethic
used their commitment to Novikianism to garner a professional edge in their industry
used a large portion of the proceeds of their business to fund Novikian projects, overseen by a foundation staffed exclusively by elite Novikians and advised by Novik herself
fostered a community of Novikians around their lavish corporate headquarters
… then I think it would be fair to attribute some of the impact of their actions to Novikianism.
Some corrections of the Sequoia info:
I’ve never been a grad student.
I’m neither Japanese nor a Japanese citizen.
I ‘volunteered’ in the sense that people at Alameda reached out to me, I said ok and then got paid by the hour for my help.
‘(obscure, rural)’ is an exaggeration. ‘provincial’ would be a more apt adjective for the location. The main bank we used was SMBC, the second-largest bank in Japan.
‘for a fee’ sounds as if it was some sort of bribe to get them to do what we wanted. But we only paid the usual transaction fees and margin that any bank would charge.
But mostly, if https://forum.effectivealtruism.org/posts/xafpj3on76uRDoBja/the-ftx-future-fund-team-has-resigned-1?commentId=hpP8EjEt9zTmWKFRy is accurate, I’m bummed that the money I helped earn was squandered right away.
Definitely: you are obviously right and Eliezer obviously wrong about this, imho.
BUT
I do think it is hindsight bias to some degree to think that “EA” as a collective or Will MacAskill as an individual are recorded as doing something wrong, in the sense of “predictably a bad idea,” at any point in the passages you quote. (I know you didn’t actually claim that!) It’s not immoral to tell someone to found a business, so it’s definitely not immoral to tell someone to found a business and give to charity. It’s not immoral to help someone make a legal, non-scammy trade, as the anonymous Japanese EA apparently did (“buy low and sell high” is not poor business ethics as far as I know, though I’m prepared to be corrected about that by someone who actually knows finance). It’s a bit more controversial to say it’s not wrong to take very rich people’s money to do the sort of work EA charities do, but it’s certainly not obvious that it is, and nothing in the quoted passages actually shows that any individual had evidence that FTX was a bad org to be associated with. (They may well have; I’m not saying no one did wrong, just that no wrongdoing is suggested by the information quoted here.) Furthermore, “take money from rich people for philanthropy and speculative academic research” isn’t exactly a uniquely EA practice!
That leaves suggesting FTX think in utilitarian terms about maximizing, but I think it is obviously a complicated question whether that was a knowably bad idea when it was done, and depends on the details of how it was done.
Of course, there may well have been wrongdoing at some point, but we need proper investigation before we decide. And furthermore, we can’t just assume that preventing any wrongdoing, even severe wrongdoing, that did occur would have saved the depositors SBF stole from, who are the main victims of this whole mess. My guess is that once the early decision to encourage SBF to found Alameda was made by Will, and SBF received some early help from the community, withdrawing our support later would not have done very much to prevent FTX from becoming a successful business that stole from its customers. But those early decisions are probably the least morally suspicious, in that they were taken early, when the least information about the business ethics of SBF and FTX/Alameda was available. To repeat: I don’t think telling someone to found a business to earn to give, or helping a business make a legal, non-scammy trade, is itself immoral. (Again, I’m assuming the trade was legal and non-scammy, but I’m very willing to be corrected!) The suspicious decision that might have been decisive was maybe “get SBF and other FTX/Alameda high-ups to think in a utilitarian way.” But as I say, I don’t think it’s reasonable to hold that that was clearly wrong at the time.
Thanks for this comment.
I’m more interested in reflecting on the foundational issues in EA-style thinking that contributed to the FTX debacle than in ascribing wrongdoing or immorality (though I agree that the whole episode should be thoroughly investigated).
Examples of foundational issues:
FTX was an explicitly maximalist project, and maximization is perilous
Following a utilitarian logic, FTX/Alameda pursued a high-leverage strategy (Caroline on leverage); the decision to pursue this strategy didn’t account for the massive externalities that resulted from its failure
The Future Fund failed to identify an existential risk to its own operation, which casts doubt on their/our ability to perform risk assessment
EA’s inability and/or unwillingness to vet FTX’s operations (lack of financial controls, lack of board oversight, no ring-fence around funds committed to the Future Fund) and SBF’s history of questionable leadership points to overeager power-seeking
MacAskill’s attempt to broker an SBF <> Elon deal re: purchasing Twitter also points to overeager power-seeking
Consequentialism straightforwardly implies that the ends justify the means at least sometimes; protesting that the ends don’t justify the means is cognitive dissonance
EA leadership’s stance of minimal communication about their roles in the debacle points to a high weight placed on optics / face-saving (Holden’s post and Oli’s commenting are refreshing counterexamples though I think it’s important to hear more about their involvement at some point too)
Sounds right to me!
I agree with Eliezer that a lot of EAs are over-blaming EA for the FTX implosion, based on the facts currently known. But the Scholomance case is obviously a lot weaker than the EA case in real life, and this is a great summary of why.
The point is not “EA did as little to shape Alameda as Novik did to shape Alameda” but “here is an example of the mental motion of trying to grab too much responsibility for yourself”.
Fair!