[Question] Is this quote from SBF aligned with EA?

Here is a quote from SBF's appearance on Conversations With Tyler. Emphasis mine for skimmability. https://conversationswithtyler.com/episodes/sam-bankman-fried/

COWEN: Okay, but let’s say there’s a game: 51 percent, you double the Earth out somewhere else; 49 percent, it all disappears. Would you play that game? And would you keep on playing that, double or nothing?
BANKMAN-FRIED: With one caveat. Let me give the caveat first, just to be a party pooper, which is, I’m assuming these are noninteracting universes. Is that right? Because to the extent they’re in the same universe, then maybe duplicating doesn’t actually double the value because maybe they would have colonized the other one anyway, eventually.
COWEN: But holding all that constant, you’re actually getting two Earths, but you’re risking a 49 percent chance of it all disappearing.
BANKMAN-FRIED: Again, I feel compelled to say caveats here, like, “How do you really know that’s what’s happening?” Blah, blah, blah, whatever. But that aside, take the pure hypothetical.
COWEN: Then you keep on playing the game. So, what’s the chance we’re left with anything? Don’t I just St. Petersburg paradox you into nonexistence?
BANKMAN-FRIED: Well, not necessarily. Maybe you St. Petersburg paradox into an enormously valuable existence. That’s the other option.

----

1) Is this quote from SBF aligned with EA? (“Not particularly aligned or unaligned” is of course a valid response.)
2) Regardless of your first answer: can you articulate a value system or system of ethics under which it is moral to play the game Tyler describes (ad infinitum), but not moral to risk 80% of FTX depositor funds on a 75% chance of doubling the money and donating all of it to effective charity (even once, much less ad infinitum)? A sketch of the arithmetic follows below.
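For concreteness, here is a minimal sketch of the expected-value arithmetic both gambles turn on. The assumptions are mine, not from the transcript: value is linear in Earths/dollars, and "doubling" in the FTX bet applies to the 80% that is staked. It shows that the repeated game's expected value grows without bound (1.02^n) even as the probability that anything survives goes to zero (0.51^n), while the one-shot FTX-style bet comes out to 1.4x deposits in expectation.

```python
# Minimal sketch of the expected-value arithmetic (assumptions mine:
# value is linear, and "doubling" in the FTX bet applies to the staked 80%).

# Tyler's game, played n times in a row:
# each round, 51% chance the stake doubles, 49% chance everything is lost.
p_win, mult = 0.51, 2.0
for n in (1, 10, 100):
    p_survive = p_win ** n       # chance anything at all is left
    ev = (p_win * mult) ** n     # expected value: grows like 1.02^n
    print(f"n={n:>3}: P(survive) = {p_survive:.2e}, E[value] = {ev:.2f}x")

# The FTX-style bet, taken once: keep 20% of deposits, stake 80%;
# 75% chance the stake doubles, 25% chance it is lost.
keep, stake = 0.20, 0.80
ev_once = 0.75 * (keep + 2 * stake) + 0.25 * keep  # = 1.40x deposits
print(f"One FTX-style bet: E[value] = {ev_once:.2f}x deposits")
```

Under naive expected-value maximization, both gambles are favorable, which is what makes question 2 bite: the value system has to separate them on some ground other than raw expected value.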

Bonus question: Did you find SBF’s response surprising? Do you think most leaders in the EA community would have found it surprising?
