Thanks Michelle, great to hear about your continuing fantastic work!
“In research, our comparative advantage continues to be identifying crucial considerations”
feels a little tautological/vague to me. Is there a particular interpretation of crucial considerations you were going for? Am I reading it right in thinking that your key contribution here is challenging GiveWell’s methods and joining the dots a bit more?
Could you elaborate a bit more about GWWC’s comparative advantage in research? What is GWWC in a good position to do, and what could better be done by e.g. GiveWell or academic research?
Hi Tom and Imma,
Thanks for the questions.
One example of a crucial consideration is disease interactions, which have the potential to significantly influence the cost-effectiveness analyses of charities. One such interaction is that between deworming and malaria, which is obviously relevant and important given that we recommend both malaria and deworming charities. We reviewed the literature on this interaction this year, and it turns out that STH infections might have some protective effect against malaria, and thus deworming might increase malaria risk. GiveWell has picked up on this and cited our review in their latest review of deworming: http://www.givewell.org/international/technical/programs/deworming#header-3
Even if this interaction didn’t turn out to be all that worrying, it could be seen as a near miss (obviously, if the effect size of this interaction had been bigger, it would have been more likely to already be on everyone’s radar, but still).
And yes, even though we think very highly of GiveWell and their research output, we also think it is good to have at least one other independent source in this space.
Academic researchers are not usually doing what we do. They review the theoretical cost-effectiveness of an intervention to inform the policy of big organisations in development; we are trying to bridge the gap between the scientific literature’s theoretical cost-effectiveness estimates and the effectiveness of particular organisations. Also, the field of cost-effectiveness research is still very young, and even though some researchers produce cost-effectiveness estimates in their particular field (e.g. estimating the cost-effectiveness of vaccines), there are few people who specialize in getting an overview of the different estimates and comparing them. One exception is the DCP (DCP-3.org), but then again their work is quite theoretical in the sense that their estimates are averages of large-scale interventions that cater to bigger organisations and health ministries.
Generally, even setting these considerations aside, there are good reasons for us to have in-house expertise in the form of research analysts: in order to communicate professionally with our members and the general public, we need a deep understanding of the topic so we can fact-check our materials and remain credible, and that understanding can only be gained by doing some of the research ourselves.
This is what we write in our upcoming impact evaluation on our research:

“Since our members are now consistently donating millions of dollars to effective charities each year, it is crucial that we continue to increase our in-house expertise on charity effectiveness. We must continually inform and fact-check our outreach and marketing, represent Giving What We Can at scientific conferences and meetings, talk to other key players in the development sector on an equal footing, and, most importantly, ensure that we always recommend the most effective charities to our members. It is vital for us to stay abreast of relevant findings coming from both academic and non-academic sources, and to communicate these findings to our audience in an accurate and accessible manner. We are planning on hiring for one more full-time equivalent research position. There are three reasons for wanting to increase our research capacity.

First, due to increasing interest in effective altruism from the public, the media and potential members, we receive an increasing volume of questions about charity effectiveness, and these need to be answered swiftly and competently. There is also increasing demand for our researchers to give talks and answer questions on the results of their research; while this is excellent for our profile, it does place strain on our capacity. Secondly, as Giving What We Can grows and moves more money, our responsibility as stewards of donations becomes greater: we need to remain confident in the charities we recommend and scale up our research capacity accordingly. Even granting that GiveWell, another charity evaluator, has become increasingly professionalized, we still think it is important to have at least one other organisation conducting research and keeping up with the literature on charity effectiveness. Finally, the community as a whole has blind spots on topics such as climate change, and it is imperative that we dedicate time to these issues. The distinctive feature of the effective altruism community is that we use evidence and analysis to come to decisions on where to donate; we cannot afford to leave serious gaps simply because of the time commitment required to look into them.”
One other benefit of our research is that we sometimes advise donors who are considering donating a substantial amount of money on where best to give. Sometimes these reports are tailor-made because donors have hard requirements and want to donate within a particular cause area. We often think this is a very effective use of our time, because we can influence a large amount of money.