I don’t think the “3% credence in utilitarianism” is particularly meaningful; doubting the merits of a particular philosophical framework someone uses isn’t an obvious reason to be suspicious of them. Particularly not when Sam ostensibly reached similar conclusions to Will about global priorities, and MacAskill himself has obviously been profoundly influenced by utilitarian philosophers in his goals too.
But I do think there’s one specific area where SBF’s public philosophical statements were extremely alarming even at the time, and he made them while in “explain EA” mode. That’s when Sam made it quite clear that if he had a 51% chance of doubling world happiness versus a 49% chance of ending it, he’d accept the bet — a train to crazytown not many utilitarians would jump on, and one which sounds a lot like how he actually approached everything.
Then again, SBF isn’t a professional philosopher and never claimed to be, other people have said equally dumb stuff and not gambled away billions of other people’s money, and I’m not sure MacAskill himself would even have read or heard Sam utter those words.
Will’s publicly expressed view on that sort of double-or-nothing gamble is hard to pin down, but it is clearly not as robustly opposed as common sense would require, though it is also clearly a lot LESS positive than SBF’s view that you should obviously take it: https://conversationswithtyler.com/episodes/william-macaskill/
(I haven’t quoted from the interview because there is no single clear quote expressing Will’s position; text search for “double” and you’ll find the relevant parts.)