Not sure I agree, but then again, there’s no clear nailed-down target to disagree with :p
For particular people’s behaviour in a social environment, there’s a high prior that the true explanation is complex. That doesn’t nail down which complex story we should update towards, so any individual simple story still carries more probability mass than any individual complex story. But what it does mean is that if someone gives you a complex story, you shouldn’t be surprised that the story is complex, and so the complexity itself shouldn’t reduce your trust in them, at least not by much. A toy numeric sketch of this is below.
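A minimal numeric sketch of the point above (all numbers invented purely for illustration): the prior mass on "the explanation is complex" can be high while each individual complex story is still less probable than each individual simple story, because the complex mass is split across many more candidates.

```python
# Toy illustration (all numbers invented): high prior on complexity,
# spread thinly over many candidate complex stories.

p_complex = 0.8           # prior that the true explanation is complex
p_simple = 1 - p_complex  # prior that it is simple

n_complex_stories = 10_000  # many ways for a story to be complex
n_simple_stories = 10       # few ways for it to be simple

per_complex = p_complex / n_complex_stories  # mass on any one complex story
per_simple = p_simple / n_simple_stories     # mass on any one simple story

print(f"each complex story: {per_complex:.5f}")  # 0.00008
print(f"each simple story:  {per_simple:.5f}")   # 0.02000
# Any single simple story beats any single complex story (0.02 > 0.00008),
# yet hearing *some* complex story is exactly what the prior predicted.
```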
(Actually, I guess sometimes, if someone gives you a simple story, and the prior on complex true stories is really high, you should distrust them more.)
To be clear, if someone has a complex story for why they did what they did, you can penalise that particular story for its complexity, but you should already be expecting whatever story they produce to be complex. In other words, if your prior distribution over how complex their story will be is nearly equal to your posterior distribution (the complexity of their story roughly fits your expectations), then however much you think complexity should update your trust in people, you should already have been distrusting them approximately that much based on your prior. Conservation of expected evidence!
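For reference, here is the principle being invoked, in a minimal LaTeX sketch; the symbols are mine, not the commenter's (read H as "they are trustworthy" and E as "their story is complex"):

```latex
% Conservation of expected evidence: the prior equals the expectation
% of the posterior over the possible observations.
\[
  P(H) \;=\; P(H \mid E)\,P(E) \;+\; P(H \mid \neg E)\,P(\neg E)
\]
% If P(E) is close to 1 (you already expected a complex story), then
% P(H) is close to P(H | E): observing the expected complexity can
% barely move your trust.
```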
Okay, fine, a couple of caveats:
Distrust complicated stories that don’t have much simpler versions that also make sense, unless they’re pinned down very precisely by the evidence. When two sides of a yes-no question both complain the other side is committing this sin, you now have a serious challenge to your epistemology and you may need to sit down and think about it.
Distrust complicated designs unless you can calculate very precisely how they’ll work or they’ve been validated by a lot of testing on exactly the same problem distribution you’re drawing from.
What are you even talking about?
Mostly, I think EAs are beating themselves up too much about FTX. But separately, one of the few problems I think EA actually does have is producing really lengthy writeups of things that don’t simplify well and don’t come with tl;drs, a la the incentives of the academic paper factory. And the life wisdom that produces distrust of complicated things that don’t simplify well is produced in part by watching complicated things like FTX implode, and drawing a lesson of (bounded, defeasible) complexity-distrust from that.
This seems like a good summary of what you’re saying.
That just says SBF doesn’t like to read books, and then calls him an idiot for that?
Occam’s razor?