Here are some excerpts from Sequoia Capital’s profile on SBF (published September 2022, now pulled).
On career choice:
Not long before interning at Jane Street, SBF had a meeting with Will MacAskill, a young Oxford-educated philosopher who was then just completing his PhD. Over lunch at the Au Bon Pain outside Harvard Square, MacAskill laid out the principles of effective altruism (EA). The math, MacAskill argued, means that if one’s goal is to optimize one’s life for doing good, often most good can be done by choosing to make the most money possible—in order to give it all away. “Earn to give,” urged MacAskill.
…
It was his fellow [fraternity members] who introduced SBF to EA and then to MacAskill, who was, at that point, still virtually unknown. MacAskill was visiting MIT in search of volunteers willing to sign on to his earn-to-give program.
At a café table in Cambridge, Massachusetts, MacAskill laid out his idea as if it were a business plan: a strategic investment with a return measured in human lives. The opportunity was big, MacAskill argued, because, in the developing world, life was still unconscionably cheap. Just do the math: At $2,000 per life, a million dollars could save 500 people, a billion could save half a million, and, by extension, a trillion could theoretically save half a billion humans from a miserable death.
MacAskill couldn’t have hoped for a better recruit. Not only was SBF raised in the Bay Area as a utilitarian, but he’d already been inspired by Peter Singer to take moral action. During his freshman year, SBF went vegan and organized a campaign against factory farming. As a junior, he was wondering what to do with his life. And MacAskill—Singer’s philosophical heir—had the answer: The best way for him to maximize good in the world would be to maximize his wealth.
SBF listened, nodding, as MacAskill made his pitch. The earn-to-give logic was airtight. It was, SBF realized, applied utilitarianism. Knowing what he had to do, SBF simply said, “Yep. That makes sense.” But, right there, between a bright yellow sunshade and the crumb-strewn red-brick floor, SBF’s purpose in life was set: He was going to get filthy rich, for charity’s sake. All the rest was merely execution risk.
His course established, MacAskill gave SBF one last navigational nudge to set him on his way, suggesting that SBF get an internship at Jane Street that summer.
In 2017, everything was going great for SBF. He was killing it at Jane Street… He was giving away 50 percent of his income to his preferred charities, with the biggest donations going to the Centre for Effective Altruism and 80,000 Hours. Both charities focus on building the earn-to-give idea into a movement. (And both had been founded by Will MacAskill a few years before.) He had good friends, mostly fellow EAs. Some were even colleagues… [much further down in the profile]
So when, that next summer, MacAskill sat with SBF in Harvard Square and carefully explained, in the way only an Oxford-educated philosopher can, that the practice of effective altruism boils down to “applied utilitarianism,” Snipe’s arrow hit SBF hard. He’d found his path. He would become a maximization engine. As he wrote in his blog, “If you’ve decided that some of your time—or money—can be better spent on others than on yourself, well, then, why not more of it? Why not all of it?”
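(The cost-per-life pitch quoted above is just division at a fixed price per life; a few lines make the scaling explicit. The $2,000 figure is the profile's own, not a current cost-effectiveness estimate.)

```python
# The division behind the quoted pitch: lives "saved" per budget at a
# fixed cost per life. $2,000 is the figure the profile attributes to
# MacAskill, used here purely for illustration.
COST_PER_LIFE = 2_000

def lives_saved(budget: int) -> int:
    """Number of lives a budget buys at the quoted fixed cost."""
    return budget // COST_PER_LIFE

assert lives_saved(1_000_000) == 500                  # a million
assert lives_saved(1_000_000_000) == 500_000          # a billion
assert lives_saved(1_000_000_000_000) == 500_000_000  # a trillion
```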
On deciding what to do after leaving Jane Street:
SBF made a list of possible options, with some notes about each:
- Journalism—low pay, but a massively outsized impact potential.
- Running for office—or maybe just being an advisor?
- Working for the movement—EA needs people!
- Starting a startup—but what, exactly?
- Bumming around the Bay Area for a month or so—just to see what happens.
On setting up the initial Japanese Bitcoin arbitrage at Alameda:
Fortunately, SBF had a secret weapon: the EA community. There’s a loose worldwide network of like-minded people who do each other favors and sleep on each other’s couches simply because they all belong to the same tribe. Perhaps the most important of them was a Japanese grad student, who volunteered to do the legwork in Japan. As a Japanese citizen, he was able to open an account with the one (obscure, rural) Japanese bank that was willing, for a fee, to process the transactions that SBF—newly incorporated as Alameda Research—wanted to make.
The spread between Bitcoin in Japan and Bitcoin in the U.S. was “only” 10 percent—but it was a trade Alameda found it could make every day. With SBF’s initial $50,000 compounding at 10 percent each day, the next step was to increase the amount of capital.
At the time, the total daily volume of crypto trading was on the order of a billion dollars. Figuring he wanted to capture 5 percent of that, SBF went looking for a $50 million loan. Again, he reached out to the EA community. Jaan Tallinn, the cofounder of Skype, put up a good chunk of that initial $50 million.
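(A quick sanity check of the figures quoted above, taking the profile's numbers at face value and assuming the 10 percent spread could really be captured and fully reinvested once per day, with no fees, slippage, or capital limits:)

```python
# Rough sanity check of the profile's arbitrage figures.
# Assumptions: 10% captured daily, fully reinvested, frictionless.

def compound(principal: float, daily_rate: float, days: int) -> float:
    """Grow `principal` at `daily_rate` per day for `days` days."""
    return principal * (1 + daily_rate) ** days

start = 50_000.0
after_a_month = compound(start, 0.10, 30)  # ~ $872,000 in 30 days

# The $50M loan target as a share of the claimed daily market volume:
daily_volume = 1_000_000_000
target_share = 0.05
loan_target = daily_volume * target_share  # $50,000,000
```

Under those (unrealistic) assumptions the initial stake would grow roughly seventeenfold in a month, which is why raising outside capital quickly became the binding constraint rather than the trade itself.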
On the early days at Alameda:
The first 15 people SBF hired, all from the EA pool, were packed together in a shabby, 600-square-foot walk-up, working around the clock. The kitchen was given over to stand-up desks, the closet was reserved for sleeping, and the entire space overrun with half-eaten take-out containers. It was a royal mess. But it was also the good old days, when Alameda was just kids on a high-stakes, big-money, earn-to-give commando operation. Fifty percent of Alameda’s profits were going to EA-approved charities.
“This thing couldn’t have taken off without EA,” reminisces Singh, running his hand through a shock of thick black hair. He removes his glasses to think. They’re broken: A chopstick has been Scotch taped to one of the frame’s sides, serving as a makeshift temple. “All the employees, all the funding—everything was EA to start with.”
On how he was thinking about future earnings:
“Am I,” [reporter asks], “talking to the world’s first trillionaire?”
...
“Maybe let’s take a step back,” he says, only to launch into an explanation of his own, personal utility curve: “Which is to say, if you plot dollars-donated on the X axis, and Y is how-much-good-I-do-in-the-world, then what does that curve look like? It’s definitely not linear—it does tail off, but I think it tails off pretty slowly.”
His point seems to be that there is, out there somewhere, a diminishing return to charity. There’s a place where even effective altruism ceases to be effective. “But I think that, even at a trillion, there’s still really significant marginal utility to dollars donated.”
...
“So, is five trillion all you could ever use to help the world?”
...
“Okay, at that scale, I think the answer might be yes. Because, if your spending is on the scale of the U.S. government, it might have too weird and distortionary an impact on things.”
… so, money spent now will be more effective at making the world a better place than money spent later. “I think there are some things that are pretty urgent,” SBF says. “There’s just a long series of crucial considerations, and all of them matter a lot—and you can’t fuck any of them up, or you miss most of the total value that you could ever get.”
To be clear, SBF is not talking about maximizing the total value of FTX—he’s talking about maximizing the total value of the universe. And his units are not dollars: In a kind of GDP for the universe, his units are the units of a utilitarian. He’s maximizing utils, units of happiness. And not just for every living soul, but also every soul—human and animal—that will ever live in the future. Maximizing the total happiness of the future—that’s SBF’s ultimate goal. FTX is just a means to that end.
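(The profile never pins down a functional form for the "utility curve" SBF describes. As an illustration only, a square-root curve is one stock example of a concave function whose marginal value declines but never reaches zero, which is the shape he seems to be gesturing at:)

```python
import math

# Hypothetical concave utility curve: dollars donated on X,
# good done on Y. The functional form is this comment's invention,
# not SBF's -- he only claims the curve "tails off pretty slowly".

def good_done(dollars_donated: float) -> float:
    """Illustrative concave utility: diminishing but nonzero returns."""
    return math.sqrt(dollars_donated)

def marginal_utility(dollars: float, step: float = 1.0) -> float:
    """Approximate slope of the curve at a given donation level."""
    return (good_done(dollars + step) - good_done(dollars)) / step

# Marginal utility shrinks as donations grow, but stays positive
# even at a trillion dollars.
at_a_million = marginal_utility(1_000_000)
at_a_trillion = marginal_utility(1_000_000_000_000)
assert 0 < at_a_trillion < at_a_million
```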
On what differentiates FTX in crypto:
The FTX competitive advantage? Ethical behavior. SBF is a Peter Singer–inspired utilitarian in a sea of Robert Nozick–inspired libertarians. He’s an ethical maximalist in an industry that’s overwhelmingly populated with ethical minimalists. I’m a Nozick man myself, but I know who I’d rather trust my money with: SBF, hands-down. And if he does end up saving the world as a side effect of being my banker, all the better.
On the EA community in the Bahamas that congealed around FTX:
A cocktail party is in full swing, with about a dozen people I don’t recognize standing around. It turns out to be a mixer for the local EA community that’s been drawn to Nassau in the hopes that the FTX Foundation will fund its various altruistic ideas. The point of the party is to provide a friendly forum for the EAs who actually run EA-aligned nonprofits to meet the earn-to-give EAs at FTX who will fund them, and vice versa. The irony is that, while FTX hosts the weekly mixer—providing the venue and the beverages—it’s rare for an actual FTX employee to ever show up and mix. Presumably, they’re working too hard.
...
“Imagine nerds invented a religion or something,” says Woods, stabbing at my question with vigor, “where people get to argue all day.”
“It’s… an ideology,” counters Morrison. The argument has begun.
Woods amiably disagrees: “EA is not an ideology, it’s a question: ‘How do I do the most good?’ And the cool thing about EA, compared to other cause areas, is that you can change your views constantly—and still be part of the movement.”
...
Woods serves up an answer to my question. (Fittingly, she’s wearing tennis whites.) “EA attracts people who really care, but who are also really smart,” she says. “If you are altruistic but not very smart, you just bounce off. And if you’re smart but not very altruistic,” she continues, “you can get nerd sniped!”
...
“This ties into the way FTX is doing its foundation,” Morrison says, helpfully knocking the ball back to my true interest. “The foundation wants to get a lot of money out there in order to try a lot of things quickly. And how can you do that effectively?” It’s a rhetorical question, a move worthy of a preppy debate champ who went to a certain finishing school in Cambridge—which is exactly what Morrison is. “Part of the answer is to give money to someone in the EA community.”
“Because EA is different from other communities,” Woods continues, picking up right where Morrison left off. “They’re like, ‘This is the ethical thing, and this is the truth.’ And we’re like, ‘What is the ethical thing? What is the truth?’”
Following your analogy, if a fan of Novik had:
- been convinced by Novik to dedicate their career to the Novikian ethic
- been pointed by Novik to a promising first job in that career path
- decided to leave that promising first job on the basis of Novikian reasoning, framing the question of what to do next in Novikian terms
- worked with a global network of Novikians to implement an international crypto arbitrage
- received seed funding from a prominent Novikian to scale up this arbitrage
- exclusively hired Novikians to continue scaling the arbitrage once it started working
- thought about forward-facing professional decisions strictly in terms of the Novikian ethic
- used their commitment to Novikianism to garner a professional edge in their industry
- used a large portion of the proceeds of their business to fund Novikian projects, overseen by a foundation staffed exclusively by elite Novikians and advised by Novik herself
- fostered a community of Novikians around their lavish corporate headquarters
… then I think it would be fair to attribute some of the impact of their actions to Novikianism.
It’s fair enough to feel betrayed in this situation, and to say so publicly.
But given your position in the EA community, I think it’s much more important to put effort towards giving context on your role in this saga.
Some jumping-off points:
Did you consider yourself to be in a mentor / mentee relationship with SBF prior to the founding of FTX? What was the depth and cadence of that relationship?
e.g. from this Sequoia profile (archived as they recently pulled it from their site):
“The math, MacAskill argued, means that if one’s goal is to optimize one’s life for doing good, often most good can be done by choosing to make the most money possible—in order to give it all away. “Earn to give,” urged MacAskill.
… And MacAskill—Singer’s philosophical heir—had the answer: The best way for him to maximize good in the world would be to maximize his wealth. SBF listened, nodding, as MacAskill made his pitch. The earn-to-give logic was airtight. It was, SBF realized, applied utilitarianism. Knowing what he had to do, SBF simply said, “Yep. That makes sense.””
What diligence did you / your team do on FTX before agreeing to join the Future Fund as an advisor?
[Edited to add: Were you aware of the 2018 dispute at Alameda re: SBF’s leadership? If so, how did this context factor into your decision to join the Future Fund?]
Did you have visibility into where money earmarked for Future Fund grants was being held?
Did you understand the mechanism by which FTX claimed to be generating revenue? Were the revenues they reported sanity-checked against a back-of-the-envelope estimate of how much their claimed mechanism would be able to generate?
What were your responsibilities at the Future Fund? How often were you in contact with SBF and other members of FTX leadership in your role as an advisor?