Seems worthwhile to quote the relevant bit of the interview:
====
Sam Bankman-Fried: If your goal is to have impact on the world — and in particular if your goal is to maximize the amount of impact that you have on the world — that has pretty strong implications for what you end up doing. Among other things, if you really are trying to maximize your impact, then at what point do you start hitting decreasing marginal returns? Well, in terms of doing good, there’s no such thing: more good is more good. It’s not like you did some good, so good doesn’t matter anymore. But how about money? Are you able to donate so much that money doesn’t matter anymore? And the answer is, I don’t exactly know. But you’re thinking about the scale of the world there, right? At what point are you out of ways for the world to spend money to change?
Sam Bankman-Fried: There’s eight billion people. Government budgets run in the tens of trillions per year. It’s a really massive scale. You take one disease, and that’s a billion a year to help mitigate the effects of one tropical disease. So it’s unclear exactly what the answer is, but it’s at least billions per year probably, so at least 100 billion overall before you risk running out of good things to do with money. I think that’s actually a really powerful fact. That means that you should be pretty aggressive with what you’re doing, and really trying to hit home runs rather than just have some impact — because the upside is just absolutely enormous.
Rob Wiblin: Yeah. Our instincts about how much risk to take on are trained on the fact that in day-to-day life, the upside for us as individuals is super limited. Even if you become a millionaire, there’s just only so much incrementally better that your life is going to be — and getting wiped out is very bad by contrast.
Rob Wiblin: But when it comes to doing good, you don’t hit declining returns like that at all. Or not really on the scale of the amount of money that any one person can make. So you kind of want to just be risk neutral. As an individual, to make a bet where it’s like, “I’m going to gamble my $10 billion and either get $20 billion or $0, with equal probability” would be madness. But from an altruistic point of view, it’s not so crazy. Maybe that’s an even bet, but you should be much more open to making radical gambles like that.
Sam Bankman-Fried: Completely agree. …
====
Hey David, yep not our finest moment, that’s for sure.
The critique writes itself, so let me offer some partial explanation:
Extemporaneous speech is full of imprecision like this, where someone is focused on highlighting one point (in this case the contrast between appropriate individual vs. altruistic risk aversion) and misses others. With close scrutiny I’m sure you could find many other cases of me presenting ideas as badly as that, and I’d imagine the same is true for all interview shows edited at the same level as ours.
Fortunately, one upside of the conversation format is that I think people don’t give it undue weight, because they accurately perceive it as being scrappy in this way. (That said, I certainly do wish I had been more careful here, and hopefully alarm bells will be more likely to go off in my head in a future similar case!)
I don’t recall people criticising this passage earlier, and I suspect that’s because prior to the FTX crash it was natural to interpret it less literally and more as pointing towards a general issue.
You can hear that with the $10b vs. $0/$20b comparison: as soon as I said it I realised it wasn’t right and wanted to pare it back (“Maybe that’s an even bet”), because there’s no expected financial gain there. I should have compared it against a $5b downside or something, but couldn’t come up with the right number on the spot.
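To spell out the arithmetic (treating the $5b figure above purely as an illustrative downside):

0.5 × $0 + 0.5 × $20b = $10b, i.e. no expected gain over simply keeping the $10b
0.5 × $5b + 0.5 × $20b = $12.5b, i.e. a gamble with genuinely positive expected financial return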
I was primarily trying to think in terms of the sorts of sums the great majority of listeners could end up dealing with, which are only very rarely above $1b; that’s what led me to add “or not really on the scale of the amount of money that any one person can make”.
If you’d criticised me for saying this in May I would have said that I was highlighting the aspect of the issue that was novel and relevant for most listeners, and that by the time someone is a billionaire donor they will have / should have already gotten individualised advice and not be relying on an introductory interview like this to guide them. They’re also likely to have become aware of the risk aversion issue just through personal experience and common sense (all the super-donors I know of certainly are aware of these issues, though I’m sure they each give it different weight).
All that said, the above passage is pretty cringe, and hopefully this experience will help us learn to steer clear of similar mistakes in future.
Thanks!