My memory of the podcast (could be wrong, only listened once!) is that Will said that, conditional on error theory being false, his credence in consequentialism is about 0.5.
I think he meant conditional on error theory being false, and also on not “some moral view we’ve never thought of”.
Here’s a quote of what Will said starting at 01:31:21: “But yeah, I tried to work through my credences once and I think I ended up in like 3% in utilitarianism or something like. I mean large factions go to, you know, people often very surprised by this, but large factions go to, you know, to error theory. So there’s just no correct moral view. Very large faction to like some moral view we’ve never thought of. But even within positive moral views, and like 50-50 on non consequentialism or consequentialism, most people are not consequentialists. I don’t think I’m.”
Overall it seems like Will’s moral views are pretty different from SBF’s (or what SBF presented to Will as his moral views), so I’m still kind of puzzled about how they interacted with each other.
’also on not “some moral view we’ve never thought of”.’
Oh, actually, that’s right. That does change things a bit.