It strikes me as much more prevalent for people to be overconfident in their own idiosyncratic opinions. If you see half of people are 90% confident in X and half of people are 90% confident in not-X, then you know on average they are overconfident. That’s how most of the world looks to me.
But no matter—they probably won’t suffer much, because the meek do not inherit the Earth, at least not in this life.
People follow confidence in leaders, generating the pathological start-up founder who is sure they’re 100x more likely to succeed than the base rate; someone who portrays themselves as especially competent in a job interview is more likely to be hired than someone who accurately appraises their merits; and I don’t imagine deferring to a boring consensus brings more romantic success than elaborating on one’s exciting contrarian opinions.
Given all this, it’s unsurprising evolution has programmed us to place an astonishingly high weight on our own judgement.
While there are some social downsides to seeming arrogant, people who preach modesty here advocate going well beyond what’s required to avoid triggering an anti-dominance reaction in others.
Indeed, even though I think strong modesty is epistemically the correct approach on the basis of reasoned argument, I do not and cannot consistently live and speak that way, because all my personal incentives are lined up in favour of my portraying myself as very confident in my inside view.
In my experience it requires a monastic discipline to do otherwise, a discipline almost none possess.
Cross-posting a reply from FB:
This seems consistent with Eliezer’s claim that “commenters on the Internet are often overconfident” while EAs and rationalists he interacts with in person are more often underconfident. In Dunning and Kruger’s original experiment, the worst performers were (highly) overconfident, but the best performers were underconfident.
Your warnings that overconfidence and power-grabbing are big issues seem right to me. Eliezer’s written a lot warning about those problems too. My main thought about this is just that different populations can exhibit different social dynamics and different levels of this or that bias; and these can also change over time. Eliezer’s big-picture objection to modesty isn’t “overconfidence and power-grabbing are never major problems, and you should never take big steps to try to combat them”; his objection is “biases vary a lot between individuals and groups, and overcorrection in debiasing is commonplace, so it’s important that whatever debiasing heuristics you use be sensitive to context rather than generically endorsing ‘hit the brakes’ or ‘hit the accelerator’”.
He then makes the further claim that top EAs and rationalists as a group are in fact currently more prone to reflexive deference, underconfidence, fear of failure, and not sticking their necks out than to the biases of overconfident startup founders. At least on Eliezer’s view, this should be a claim that we can evaluate empirically, and our observations should then inform how much we push against overconfidence vs. underconfidence.
The evolutionary just-so story isn’t really necessary for that critique, though it’s useful to keep in mind if we were originally thinking that humans only have overactive status-grabbing instincts, and don’t also have overactive status-grab-blocking instincts. Overcorrection is already a common problem, but it’s particularly likely if there are psychological drives pushing in both directions.
See also: “Do Rational People Exist?”