However, I worry that in the EA community there's an overemphasis on the "scout" mindset: being skeptical of one's own work and deferring too quickly to critiques from others.
Perhaps a minor point: the scout mindset encourages skepticism, but not deference. There’s a big difference between deferring to a critique vs. listening to and agreeing with it. I think we should hesitate to describe people as deferring to others unless either (a) they say they are doing so or (b) we have some specific reason to think they can’t be critically analysing the arguments for themselves.
Thanks for the comment, Ben! You’re right that a perfectly applied scout mindset involves critically analyzing information and updating based on evidence, rather than deferring. In theory, someone applying the scout mindset would update the correct amount based on the fact that they have an interest in a certain outcome, without automatically yielding to critiques. However, in practice, I think there’s a tendency within EA to celebrate the relinquishing of positions, almost as a marker of intellectual humility or objectivity.
This can create a culture where people may feel pressure to seem “scouty” by yielding more often than is optimal, even when the epistemological ecosystem might actually need them to advocate for the value of their intervention or program. In such cases, the desire to appear unbiased or intellectually humble could lead people to abandon or underplay their projects prematurely, which could be a loss for the broader system.
It’s a subtle difference, but I think it’s worth considering how the scout mindset is applied in practice, especially when there’s a risk of overcorrecting in the direction of giving up rather than pushing for the potential value of one’s work.
I think "scout mindset" vs. "soldier mindset" in individuals is the wrong thing to be focusing on in general. You will never succeed in making individuals perfectly unbiased. In science, plenty of people with a "soldier mindset" do great work and make great discoveries.
What matters is that the system as a whole is epistemologically healthy and has mechanisms to successfully counteract people’s biases. A “soldier” in science is still meant to be honest and argue for their views with evidence and experimentation, and other scientists are incentivized to probe their arguments for weaknesses.
A culture in which fewer people quit of their own accord, but more people are successfully pressured to leave due to high levels of skeptical scrutiny, might be superior.