The wrong tool for many… Some people accomplish a lot of good by being overconfident.
But Holden, rationalists should win. If you can do good by being overconfident, then Bayesian habits can and should endorse overconfidence.
Since “The Bayesian Mindset,” broadly construed, is all about calibrating confidence, that might sound like a contradiction, but it shouldn’t. Overconfidence is an attitude, not an epistemic state.
I disagree; Bayesian habits would lead one to the self-fulfilling prophecy point.
I like the idea of the self-fulfilling prophecy point, and expect prediction markets to work that way, but I’m still not sure that’s the outcome I’d actually expect.
I think it’s clearly true that there are at least some situations where dramatic overconfidence (above the self-fulfilling prophecy line) would make sense.
That said, these situations might be quite contrived, and in real life the benefits of extra accuracy in the many ordinary situations might outweigh the costs in the few where overconfidence would have helped.
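To make the “self-fulfilling prophecy point” concrete, here’s a toy model of what I have in mind; the success curve below is made up purely for illustration:

```python
# Toy model: the real success probability depends on the confidence you display.
# The "self-fulfilling prophecy point" is the fixed point where displayed
# confidence equals the success probability it produces.

def success_prob(displayed_confidence: float) -> float:
    """Made-up curve: displaying more confidence genuinely helps, but only partly."""
    return 0.2 + 0.6 * displayed_confidence

def fixed_point(f, x0=0.9, iters=100):
    """Iterate toward the self-fulfilling prophecy point, where f(c) == c."""
    c = x0
    for _ in range(iters):
        c = f(c)
    return c

calibrated = fixed_point(success_prob)  # confidence that exactly matches the outcome it causes
print(f"self-fulfilling prophecy point: {calibrated:.2f}")

# Displaying confidence above that point still raises the actual success
# probability, which is the sense in which "dramatic overconfidence" can pay.
print(f"success prob if you display 0.95 confidence: {success_prob(0.95):.2f}")
```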
Entrepreneurs clearly gain specific benefits from telling insanely optimistic stories, but perhaps more rational ones would lose some of those benefits while gaining others.
One could argue that you could use the Bayesian mindset to decide not to use the Bayesian mindset in some settings, which is clearly already the case (there are many situations where it’s just too expensive, for example). This is similar to how it’s possible to use a good decision theory to agree to use “insane decision theory X” in “insane situation where using insane decision theory X is optimal.”
This is great! What tools did you use to draw this?
Hey, thanks, https://excalidraw.com/
It might be true that the right expected utility calculation would endorse being overconfident, but “Bayesian mindset” isn’t about behaving like a theoretically ideal utility maximizer—it’s about actually writing down probabilities and values and taking action based on those. I think trying to actually make decisions this way is a very awkward fit with an overconfident attitude: even if the equation you write down says you’ll do best by feeling overconfident, that might be tough in practice.
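To make the awkward fit concrete, here’s a toy version of the kind of calculation I have in mind, with every number made up:

```python
# Toy expected-value comparison between "stay calibrated" and "adopt an
# overconfident attitude," using made-up numbers. Note the tension: to see
# that the overconfident attitude wins, you first have to write down the
# calibrated probability it then asks you to stop feeling.

p_success_calibrated = 0.30     # honest probability of the venture working
p_success_overconfident = 0.45  # assumed boost from energy, persuasion, morale
value_of_success = 100
cost_of_failure = -20

ev_calibrated = p_success_calibrated * value_of_success + (1 - p_success_calibrated) * cost_of_failure
ev_overconfident = p_success_overconfident * value_of_success + (1 - p_success_overconfident) * cost_of_failure

print(f"EV if calibrated:    {ev_calibrated:.1f}")    # 16.0
print(f"EV if overconfident: {ev_overconfident:.1f}")  # 34.0
```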
The tension between overconfidence and rigorous thinking is overrated:
Swisher: Do you take criticism to heart correctly?
Elon: Yes.
Swisher: Give me an example of something if you could.
Elon: How do you think rockets get to orbit?
Swisher: That’s a fair point.
Elon: Not easily. Physics is very demanding. If you get it wrong, the rocket will blow up. Cars are very demanding. If you get it wrong, a car won’t work. Truth in engineering and science is extremely important.
Source and previous discussion.