Hit-Based Giving for Global Development


There is a lack of advice in EA for individuals on where to donate if they value global development and are comfortable with high-risk, high-reward interventions: hit-based giving.

This gap could be closed by using the EA fund for global development. This could also lead to increased funding for more experimental, neglected organisations and interventions that are less likely to receive institutional money.


There may be a gap when it comes to effective altruism and systemic change in global development. This post from 80,000 Hours in 2015, "Effective altruists love systemic change", highlights many ways in which effective altruism can be used to pursue legal, cultural and political changes. But when it comes to individuals who value global development donating, most advice is still to give via charities whose evidence is mainly based on randomised controlled trials, rather than higher-risk, higher-reward options.

Looking at other cause areas traditionally supported within effective altruism, these kinds of interventions are often seen as worth funding. ACE recommends many system-changing and research organisations. This AI alignment charity comparison post highlights organisations working on research and lobbying. Open Philanthropy has expanded their giving, though focused on U.S. policy, science funding and history of philanthropy research.

For individuals who want to support potentially more impactful organisations, there isn't much dedicated research out there.


GiveWell does look at qualitative information, but its reputation rests on harder evidence, so it may not want to expand its criteria. The two quotes below highlight its position.

“We have strict criteria about the sorts of charities we recommend. These criteria are partly about achieving maximum impact, but partly about having recommendations that others can fairly easily be confident in... Thus, we think there may be many giving opportunities that are better than our top charities but don’t meet our criteria and/or are not known to us.”

“Root-causes-based approaches are, in our view, the kind of speculative and long-term undertakings that are best suited to highly engaged donors.”

New Approach

There is starting to be a shift in this area, though. Founders Pledge have a new report on evidence-based policy. In January 2019, the EA fund for global development gave $1,000,000 to J-PAL’s Innovation in Government Initiative.

“[IGI] plans to make grants to partnerships between governments, J-PAL offices, and affiliated researchers to help pilot and scale evidence-informed programs in education, health, and social assistance.”

There is also a chance that donations from people interested in effective altruism can be given to more experimental and/or newer organisations and interventions.

This might be the main advantage of EA donations in this space: a willingness to fund interventions that are hard or impossible to measure and that might be neglected by larger funders. This could mean donating to meta organisations like GiveWell or AidGrade. It could mean donations to organisations working directly on lobbying, research, accountability checking, advocacy, improving journalism, data collection, etc. It could also fund think tanks and research centres such as CGD or J-PAL.

Possible Next Steps

If there is positive feedback about the recent donation choice by the EA fund for global development, it may make sense to use that fund to donate to organisations in a more portfolio-based approach, similar to the EA fund for animal welfare: “The fund [animal welfare] could support newer but still promising organizations with less evidence to support them or organizations with smaller funding gaps”.

If there is negative feedback about the direction the EA fund for global development has taken with its latest donation, there could instead be a new fund, managed by a team of people with a variety of backgrounds in global development. This would allow donors to give to the fund that matches their risk appetite.

A third, but less likely, option is setting up an organisation similar to GiveWell or ACE to research these kinds of interventions from the point of view of individual donors.