I think I generally agree with you that people should be careful not to pull the trigger too early on closing down a project. However, in the general philanthropic landscape, I think organizations persist because of the lack of incentives to close them down, which is, of course, inefficient. EA does a good job of trying to correct this, but as with other areas of EA, it is possible that EA takes it “too far”.
I tend to think the people involved are most equipped to make this determination, and we have additional reason to trust their judgment because it likely goes against their self-interest to close a project down.
I think a related discussion could be had around funders making the decision to quit on projects too early, which is likely a much more prevalent issue.
And as an aside—I am interested in this topic for a research project. I think doing some qualitative analysis (interviews?) with folks who have closed down projects would make for a fairly interesting research paper.
I agree that funders can sometimes quit too early on promising projects. Founders and those directly involved often play a crucial role in pushing through early setbacks. However, I worry that in the EA community, there’s an overemphasis on the “scout” mindset—being skeptical of one’s own work and too quick to defer to critiques from others.
I should acknowledge that I may be strawmanning the scout mindset a bit here. In its ideal form, the scout mindset encourages intellectual humility and a willingness to see things as they are, which is clearly valuable. The practical application, however, often leads people to focus on being extra vigilant about potential biases in projects they’re closely involved with. While this caution is important, I think there’s a risk that it prevents people from taking the necessary “inside view” to successfully lead new or risky projects.
Many successful endeavors require a degree of tenacity and advocacy from the inside—the willingness to believe in a project’s potential and push forward despite early doubts. In systems like the legal world or even competitive markets, having “soldiers” who advocate strongly for their side often leads to better overall outcomes. Within EA, founders and project leaders can play this same balancing role by fighting for their initiatives when external doubts arise. Without this advocacy, projects with high long-term potential might be abandoned too soon.
The soldier mindset may be particularly important for those leading high-risk, high-reward projects. Individuals like Elon Musk at SpaceX or Jeff Bezos at Amazon had to persist through early failures because they believed in their projects when others didn’t. Their success depended on taking the inside view and pushing forward despite setbacks. If founders in EA are too quick to adopt the scout mindset, we might lose out on similarly promising projects.
In short, while the scout mindset has its place, I think we need a balance. Founders and those deeply involved in a project should serve as a necessary counterweight to broader skepticism. A healthy epistemic system requires both scouts and soldiers, and right now, I think EA might benefit from more of the latter in some contexts. I’d also be interested in case studies of people who have quit projects to better understand these dynamics. Your research idea sounds like a great way to explore whether we’re underestimating the value of persistence in high-risk, high-reward initiatives.
However, I worry that in the EA community, there’s an overemphasis on the “scout” mindset—being skeptical of one’s own work and too quick to defer to critiques from others.
Perhaps a minor point: the scout mindset encourages skepticism, but not deference. There’s a big difference between deferring to a critique vs. listening to and agreeing with it. I think we should hesitate to describe people as deferring to others unless either (a) they say they are doing so or (b) we have some specific reason to think they can’t be critically analysing the arguments for themselves.
Thanks for the comment, Ben! You’re right that a perfectly applied scout mindset involves critically analyzing information and updating based on evidence, rather than deferring. In theory, someone applying the scout mindset would update the correct amount based on the fact that they have an interest in a certain outcome, without automatically yielding to critiques. However, in practice, I think there’s a tendency within EA to celebrate the relinquishing of positions, almost as a marker of intellectual humility or objectivity.
This can create a culture where people may feel pressure to seem “scouty” by yielding more often than is optimal, even when the epistemological ecosystem might actually need them to advocate for the value of their intervention or program. In such cases, the desire to appear unbiased or intellectually humble could lead people to abandon or underplay their projects prematurely, which could be a loss for the broader system.
It’s a subtle difference, but I think it’s worth considering how the scout mindset is applied in practice, especially when there’s a risk of overcorrecting in the direction of giving up rather than pushing for the potential value of one’s work.
I think “scout mindset” vs “soldier mindset” in individuals is the wrong thing to be focusing on in general. You will never succeed in making individuals perfectly unbiased. In science, plenty of people with “soldier mindset” do great work and make great discoveries.
What matters is that the system as a whole is epistemologically healthy and has mechanisms to successfully counteract people’s biases. A “soldier” in science is still meant to be honest and argue for their views with evidence and experimentation, and other scientists are incentivized to probe their arguments for weaknesses.
A culture where fewer people quit of their own accord, but more people are successfully pressured to leave due to high levels of skeptical scrutiny, might be superior.
I would go way further than this. Most (95 percent plus) charity organizations are more concerned with continuing than anything else, and closure is only forced by money running out or by the departure of a big personality who drives the organization. Effectiveness, let alone cost-effectiveness, is very rarely a consideration.
Here in Gulu, Northern Uganda, I can’t think of one org in ten years that closed because they thought they weren’t doing enough good. Out of hundreds operating.
I think a related discussion could be had around funders making the decision to quit on projects too early, which is likely a much more prevalent issue.
The lack of incentives to write posts criticizing one’s former funders for pulling the plug early may be a challenge, though. After all, one may be looking to them for the next project. And writing such a post may not generate the positive community feeling that writing an auto-shutdown postmortem does.