It’s also good to have some independent evaluation and peer review of GiveWell’s research, even in the case of a highly transparent organisation such as GiveWell.
In the past, where has this led? Has your peer review uncovered any errors GiveWell has made? Or has it been more conceptual disagreements like the one over GiveDirectly?
I don’t think there’s a clear cut-off between conceptual and empirical disagreements, and I think it’s really important for us to highlight agreement rather than disagreement. That’s partly because it’s easy to give yourself a pretext for not donating, or for not donating to charities you aren’t personally connected with, so we don’t want to give people the impression of argument where there isn’t any (think of people citing disagreements over vaccines causing autism, or over climate change, when really scientists all basically agree). And it’s partly because we actually do agree on all the fundamentals, and I’d really like to foster a friendly, more collaborative atmosphere in effective altruism.

For those reasons, while I’ll briefly mention a couple of examples where we’ve disagreed, please read them in the spirit of us being fundamentally very much on the same page and extremely grateful for the brilliant work GW do. These are simply my impressions, and a reason we think our doing research is likely to have value despite GW’s excellent work.

Times we’ve disagreed: GWWC didn’t recommend VillageReach (subsequently dropped by GW); GWWC continued recommending AMF over 2014 because we thought it was important for AMF to have continuity in donations in order to have leverage for making larger distribution agreements (GW has now stated they will recommend AMF for at least two years, for a similar reason); and GWWC recommended both DWI and SCI earlier than GW did.
When we report on our recommended charities, such as the Against Malaria Foundation, we try to add to what GiveWell has already researched, and we hope they’ll take this into account in their future reports. For instance, here’s our new report on AMF:
https://drive.google.com/file/d/0B-ky1zIxhwx_QVBBb3ZuaVR5dEU/view?usp=sharing
I believe a lot of the research cited in this report has not yet been taken into account by GiveWell.
Also look at our recent report on SCI here:
https://www.givingwhatwecan.org/blog/2015-03-31/charity-update-ii-schistosomiasis-control-initiative-sci
where we cite research that hasn’t yet been taken into account by GiveWell.
We’re also in contact with GiveWell about their reports when we uncover errors (conceptual and factual). So far I’ve only had one email conversation with GiveWell’s Jake Marcus, about what I perceived as a misinterpretation of the decline in worm burden with age, but we ended up agreeing that we have different interpretations of the statistics.