I suspect that a crux of the disagreement about the relative importance of growth vs. epistemic virtue is whether you expect most of the value of the EA community to come from the novel insights and research it produces, or from moving money to interventions that are already well understood.
In the early days of EA, I think GiveWell’s quality was a major factor in getting people to donate, but I think the movement is now large enough that growth isn’t necessarily tied to rigor—the largest charities (like the Salvation Army or the YMCA) don’t seem to be particularly epistemically rigorous at all. I’m not sure how closely the marginal EA is checking claims, and I think EA is now mainstream enough that most newcomers don’t face strong social pressure to justify their involvement.