Thanks for the clarifying comments, Caleb!
I think GiveWell is a pretty confusing organisation in some ways (e.g. on their site I think they claim to care about things that aren't just cost-effectiveness), but to me, it clearly seems like you should just care about cost-effectiveness, and whilst some of the things that GiveWell says they care about might be good proxies/signals, they really are (imo) only to be used as proxies or signals of cost-effectiveness.
I very much agree cost-effectiveness is all that matters, in the sense that one would ideally maximise the benefits for a given amount of resources. Of course, as you point out, this does not mean the cost-effectiveness number outputted by the spreadsheet is all that matters! It is often hard to formalise all factors which influence cost-effectiveness, and so the actual cost-effectiveness estimates one obtains are not everything. Nevertheless, my impression is that GiveWell's cost-effectiveness estimates are pretty close to encompassing all their thinking. Elie mentioned on the Clearer Thinking podcast that:
GiveWell cost-effectiveness estimates are not the only input into our decisions to fund malaria programs and deworming programs, there are some other factors, but they're certainly 80% plus of the case.
This is in line with what you said about Open Phil and GiveWell agreeing with you.
I'm also a little surprised that you think the GWWC GHW fund is a more reasonable option than the GHDF; it seems to me to have a much shorter track record, and I'm pretty confused about the HLI rec.
Thanks for questioning! I have clarified my reasons a little more, updating the 1st 2 bullets of the section "Case for donating to Giving What We Can's Global Health and Wellbeing Fund" to:
Welcomes further evaluation of the process behind the recommendations of GiveWell and other evaluators in the global health and wellbeing space (e.g. Happier Lives Institute), trusts GWWC's research team to identify evaluators to rely on, and wants the evaluations to be published, as in GWWC's evaluations of evaluators.
These would be my main reasons for donating to GHWF instead of GHDF, which has not produced public evaluations of GiveWell's recommendations.
From your comment, it sounds like you have some concerns about GiveWell's recommendation process, which I think would be worth expanding on more publicly.
["accepting applications"] seems reasonable, but this mostly hinges on whether there are people we trust to evaluate the applications, and tbh I haven't come across many people who I think would do an excellent job relative to GiveWell recs. If you think there are lots of excellent people, I am very interested in hearing names.
I guess a public call might be helpful to find such people. Rethink Priorities might be open to doing some evaluations? They have been commissioned by GiveWell, and have experience incubating new projects.
I do think that my bar for funding something is higher than that of other GHW donors.
Do you have a cost-effectiveness bar as a multiple of the cost-effectiveness of GiveDirectly? It may be better to be explicit about it. GiveWell's is 10, and I believe Open Phil's is 20.
I think lots of successful GHW CE charities (some of which are run by people I really admire and are personal friends of mine) are very unlikely to beat the very best other donation options I am aware of, and imo the main purpose of funding these kinds of narrow GHW charities is almost entirely the value of discovering that one of them beats the very best GHW charities we are already aware of + have room for funding.
Fair points. Maybe such charities will eventually (not initially) go on to use funds which would otherwise have gone to less effective charities?
Nevertheless, my impression is that GiveWell's cost-effectiveness estimates are pretty close to encompassing all their thinking. Elie mentioned on the Clearer Thinking podcast that:
Fwiw I feel quite confused about how different GiveWell's recommendations would be if they were solely optimising for cost-effectiveness; I have heard different versions of how much they are already optimising for this, based on the different people I speak to (and my impression is that most public materials do not say they are solely optimising for this).
Do you have a cost-effectiveness bar as a multiple of the cost-effectiveness of GiveDirectly? It may be better to be explicit about it. GiveWell's is 10, and I believe Open Phil's is 20.
I think the bar for the first dollar should be better than the best charity I'm aware of that has room for funding (which is probably at least 20x cash; I'd guess higher). The bar for the last dollar is a bit confusing because of funding effects.
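For concreteness, here is a minimal sketch of what a bar expressed as a multiple of GiveDirectly-style unconditional cash transfers means in practice. The program names and multiples below are made up for illustration; only the 10x and 20x figures come from the discussion above.

```python
# Minimal sketch of a cost-effectiveness bar expressed as a multiple of
# GiveDirectly-style unconditional cash transfers. All program names and
# multiples are hypothetical; only the 10x/20x bars echo the discussion above.

GIVEWELL_BAR = 10.0   # GiveWell's stated bar: 10x unconditional cash transfers
OPEN_PHIL_BAR = 20.0  # Open Phil's bar, as understood in the comment above

def clears_bar(multiple_of_cash: float, bar: float) -> bool:
    """True if a program's estimated cost-effectiveness, expressed as a
    multiple of unconditional cash transfers, meets or exceeds the bar."""
    return multiple_of_cash >= bar

# Hypothetical candidate programs with made-up multiples of cash.
candidates = {"Program A": 24.0, "Program B": 12.0, "Program C": 7.0}

for name, multiple in candidates.items():
    print(f"{name}: {multiple:.0f}x cash | clears 10x bar: {clears_bar(multiple, GIVEWELL_BAR)} | clears 20x bar: {clears_bar(multiple, OPEN_PHIL_BAR)}")
```

On this framing, the "first dollar" bar Caleb describes would be set by the best funding-constrained option he is aware of (roughly 20x cash or higher), while the "last dollar" bar is harder to pin down because of funging and other funding effects.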
Hi Vasco and Caleb, we appreciate the interest in the Global Health and Development Fund! This is Isabel Arjmand responding on behalf of GiveWell.
We're grateful for the opportunity to manage this fund, and we think it's a great opportunity for donors who want to support highly cost-effective global health and development programs. We're also interested in having more in-depth conversations with Caleb and others involved in EA Funds about what the future of this fund should look like, and we'll reach out to schedule that.
In the meantime, here are some notes on our grantmaking and how donations to the fund are currently used.
We expect the impact of giving to the Global Health and Development Fund (GHDF) is about the same as giving to GiveWell's All Grants Fund: both go to the most impactful opportunities we've identified (across programs and organizations), and are a good fit for donors who'd like to support the full range of our grantmaking, including higher-risk grants and research. The online description of GHDF was written before the All Grants Fund existed (it launched in 2022), and the two funds are now filling a very similar niche. Caleb, we'd love to collaborate on updating the GHDF webpage to both reflect the existence of the All Grants Fund and include more recent grant payout reports.
In the broadest sense, GiveWell aims to maximize impact per dollar. Cost-effectiveness is the primary driver of our grantmaking decisions. But, "overall estimated cost-effectiveness of a grant" isn't the same thing as "output of cost-effectiveness analysis spreadsheet." (This blog post is old and not entirely reflective of our current approach, but it covers a similar topic.)
The numerical cost-effectiveness estimate in the spreadsheet is nearly always the most important factor in our recommendations, but not the only factor. That is, we don't solely rely on our spreadsheet-based analysis of cost-effectiveness when making grants.
We don't have an institutional position on exactly how much of the decision comes down to the spreadsheet analysis (though Elie's take of "80% plus" definitely seems reasonable!) and it varies by grant, but many of the factors we consider outside our models (e.g. qualitative factors about an organization) are in the service of making impact-oriented decisions. See this post for more discussion.
For a small number of grants, the case for the grant relies heavily on factors other than expected impact of that grant per se. For example, we sometimes make exit grants in order to be a responsible funder and treat partner organizations considerately even if we think funding could be used more cost-effectively elsewhere.
To add something to our top charities list (vs. make a grant from the All Grants Fund / GHDF), we want a high degree of confidence in the program. See our list of additional criteria for top charities here; some of those criteria aren't proxies for cost-effectiveness, but are instead capturing whether a program provides the confidence and direct case for impact that donors expect from that product.
Also, we recognize it was confusing to have GiveDirectly on our top charity list when we believed our other top charities were substantially more cost-effective. Now, our list of top charities is limited to the programs that we think can most cost-effectively use marginal funding (currently, programs we believe to have room for more funding that is at least 10x unconditional cash transfers); see the fourth bullet point here.