March 2 Update: We have a volunteer who is taking on this project. As a result, Joey and I broke down the project more to the following questions:
1.) What were the top twenty foreign aid foundations (including government agencies) from 1975 to 2000, in terms of total grant dollars given to foreign aid (e.g., DFID, USAID, Gates/GAVI)? Scoring them relative to each other on a 1-5 scale, with 5 being most accurately described as “hits-based” and 1 being most accurately described as “proven evidence-backed”, how would you score each? (Also, is this a useful dichotomy?) Please try to provide justification for the rankings.
2a.) Looking back at the list of top twenty orgs by size, pick the top five orgs by size that are more “hits-based” and the top five orgs by size that are more “evidence-backed”.
2b.) From each of these orgs, look at their top 10 grants by grant size. Of these, pick two grants that are likely to be the highest impact and two grants that are likely to be of average impact (relative to the ten grants from that org). You can look at their website, wiki page, and stated granting strategies to get a sense of this. (There will be 40 grants considered in total.) Briefly describe each grant's outcomes and size. Present these grants shuffled and as blinded as possible (no org name) to Joey and me, so that we can independently rank them without knowing whether they came from hits-based orgs or not.
2c.) Using your own research, as best as possible, try to quantify the impact of these grants.
2d.) Combining our judgments, come to an overall assessment, as best as possible, of the relative success of “hits-based” and “evidence-based” orgs.
We also have a bonus question that is much lower priority but might be of potential interest down the road:
3.) Can VC firms be described as pursuing a “hits-based” strategy? How much due diligence do they put into their investments before making them? How does this due diligence compare to OpenPhil's? Is there anything we can learn from VC strategy to inform EA strategy?
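For the blinding step in (2b), one possible workflow is to keep the funder names in a separate key while showing reviewers only a shuffled, anonymized list. A minimal sketch (the grant records and field names here are hypothetical, not from the actual dataset):

```python
import random

# Hypothetical grant records; the real exercise would have 40 of these.
grants = [
    {"funder": "Funder A", "description": "Vaccine rollout support", "size_usd": 5_000_000},
    {"funder": "Funder B", "description": "Agricultural research program", "size_usd": 2_000_000},
]

random.seed(42)  # fixed seed so the blinding key is reproducible
order = list(range(len(grants)))
random.shuffle(order)

# What reviewers see: shuffled entries with funder names stripped.
blinded = [
    {"id": i, "description": grants[j]["description"], "size_usd": grants[j]["size_usd"]}
    for i, j in enumerate(order)
]

# Private key for un-blinding after independent rankings are collected.
key = {i: grants[j]["funder"] for i, j in enumerate(order)}
```

Keeping the key private until both rankings are in is what makes the two assessments independent of knowledge about which orgs are hits-based.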
-
Joey and I separately estimated how long it would take to do (1) + (2). We averaged our estimates and then multiplied by 1.5 to adjust for the planning fallacy, arriving at a total of 70 hours. Since this is more than we originally thought, we decided to raise the pay from $1,500 to $2,000.
I am sorry. It appears that a GuideStar Premium account is needed. (Or the questions will need to be changed—specifically the time period of the first question.) Or maybe there is a research tool/engine that I’m not aware of.
Anyone, please feel free to continue; the document is fully open for editing. (You will also be able to see others’ work in real time, and any change can be reverted.)
Anyway, here is a little bit of headway:
https://docs.google.com/document/d/1eAjrPIDINvE-g7bGP8b0hIDVFid36UrEE5nYZUkZ4q4/edit?usp=sharing
Thanks! I can take it from here. :)
Congratulations! This is very exciting and I’m looking forward to hearing about future updates.