It strikes me as much more prevalent for people to be overconfident in their own idiosyncratic opinions. If you see that half of people are 90% confident in X and the other half are 90% confident in not-X, then you know that, on average, they are overconfident. That’s how most of the world looks to me.
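To spell out the arithmetic behind that observation (an illustrative back-of-the-envelope calculation, assuming everyone’s stated confidence is exactly 90%): exactly one of X and not-X is true, so exactly half of those people are backing a true claim, even though each of them reports 90% confidence.

```latex
% Back-of-the-envelope calibration check for the split-opinion example.
% Exactly one of X and not-X is true, so exactly half of the 90%-confident
% assertions are correct.
\[
\text{mean stated confidence} = 0.9, \qquad
\text{mean accuracy} = \tfrac{1}{2}\cdot 1 + \tfrac{1}{2}\cdot 0 = 0.5,
\]
\[
\text{mean overconfidence} = 0.9 - 0.5 = 0.4 .
\]
```

So even without knowing which side is right, the group as a whole is miscalibrated by 40 percentage points on average.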
Cross-posting a reply from FB:
This seems consistent with Eliezer’s claim that “commenters on the Internet are often overconfident” while EAs and rationalists he interacts with in person are more often underconfident. In Dunning and Kruger’s original experiment, the worst performers were (highly) overconfident, but the best performers were underconfident.
Your warnings that overconfidence and power-grabbing are big issues seem right to me. Eliezer’s written a lot warning about those problems too. My main thought about this is just that different populations can exhibit different social dynamics and different levels of this or that bias; and these can also change over time. Eliezer’s big-picture objection to modesty isn’t “overconfidence and power-grabbing are never major problems, and you should never take big steps to try to combat them”; his objection is “biases vary a lot between individuals and groups, and overcorrection in debiasing is commonplace, so it’s important that whatever debiasing heuristics you use be sensitive to context rather than generically endorsing ‘hit the brakes’ or ‘hit the accelerator’”.
He then makes the further claim that top EAs and rationalists as a group are in fact currently more prone to reflexive deference, underconfidence, fear-of-failure, and not-sticking-their-neck-out than to the biases of overconfident startup founders. At least on Eliezer’s view, this should be a claim that we can evaluate empirically, and our observations should then inform how much we push against overconfidence vs. underconfidence.
The evolutionary just-so story isn’t really necessary for that critique, though it’s useful to keep in mind if we’d otherwise assume that humans only have overactive status-grabbing instincts and don’t also have overactive status-grab-blocking instincts. Overcorrection is already a common problem, but it’s particularly likely when there are psychological drives pushing in both directions.
See also: “Do Rational People Exist?”