I like the core point and think it's very important, though I don't really vibe with statements about calibration being actively dangerous.
I think EA culture can make it seem like being calibrated is the most important thing ever. But on the question "will my ambitious projects succeed?" being calibrated seems very difficult and fairly cursed, and trying super hard at it may be less helpful than just executing.
For example, I'm guessing that Norman Borlaug didn't feed a billion people primarily by being extremely well-calibrated. I think he did it by being a good scientist, dedicating himself fully to something impactful-in-principle even when the way forward was unclear, and being willing to do things outside his normal wheelhouse, like bureaucracy or engaging with Indian government officials. I'd guess he was well-calibrated about micro-aspects of his wheat germination work, such as which experiments were likely to work out, or perhaps about which politicians would listen to him (though on the other hand, he could simply have been uncalibrated and very persistent). I wouldn't expect he'd have been well-calibrated about the overall shape of his career early on, and it doesn't seem very important for him to have been calibrated about that.
One often hears that successful political candidates always had unwarranted-seeming confidence in themselves and always thought they'd win office. I've noticed that the most successful researchers tend to seem a bit "crazy" and have unwarranted confidence in their own work. Successful startup founders, too, are not exactly known for realistic ex-ante estimates of their own success. (Of course, all of this applies to many unsuccessful political candidates, researchers, and founders as well.)
I think something psychologically important is going on here; my guess is that "part of you" really needs to believe in outsized success in order to have a chance of achieving it. This old Nate Soares post is relevant.
These are great thoughts; thank you so much for the comment, Eli!