I would say that GiveWell’s cost-effectiveness analyses are considered excellent (here is a guide from 2019), but they should be taken in context.
From https://www.givewell.org/how-we-work/our-criteria/cost-effectiveness/cost-effectiveness-models: "we consider our cost-effectiveness numbers to be extremely rough."
"There are many limitations to cost-effectiveness estimates, and we do not assess charities only—or primarily—based on their estimated cost-effectiveness."
And this old blog post: https://blog.givewell.org/2011/08/18/why-we-cant-take-expected-value-estimates-literally-even-when-theyre-unbiased/
The cost-effectiveness analyses definitely have a much smaller weight than I first assumed: I initially thought all GiveWell did was to make cost-effectiveness analyses of dozens of charities and recommend the most cost-effective ones. It seems that's completely wrong, and they rely a lot on other criteria and on information that is less quantitative or not public.
You might also be interested in the series: Concerns about AMF from GiveWell reading (especially Part 3 and Part 2)
And this 2017 FAQ from GiveWell
Thank you, this is really helpful additional information, and it's very useful to have it confirmed that GiveWell's models are considered high quality by the EA community. Really appreciate it.
Hi Lorenzo,
I think GiveWell actually puts a major weight on their cost-effectiveness analyses. Elie Hassenfeld (co-founder and CEO of GiveWell) said on the Clearer Thinking podcast (emphasis mine):

ELIE: You know, there's a lot of attention paid to the numbers, and certainly plenty of high profile institutions were behind the report. I think that, in my experience, GiveWell is one of the few institutions that's, I don't know, trying to make decisions based on cost-effectiveness analysis and doing that in a sort of consistent and principled way. GiveWell cost-effectiveness estimates are not the only input into our decisions to fund malaria programs and deworming programs, there are some other factors, but they're certainly 80% plus of the case. I think we're relatively unique in that way. I don't think there are other groups, certainly I can't think of any ones as I'm sitting here now, that are using numbers in that same way. In some ways, I think that is why we have real value added in the world, because I don't think that explicit cost-effectiveness estimates are the only way to give effectively, but it's certainly a strategy that I think should be employed significantly. I'm glad that we can be the ones to come in and play it.