Our research process: an overview from Rethink Priorities’ Global Health and Development team
Summary
Rethink Priorities’ Global Health and Development team is a multidisciplinary ten-person team conducting research on global health, international development, and climate change topics. So far, we have mostly produced “shallow”-style reports for Open Philanthropy, though we have also worked for other organizations and conducted some self-driven research. This post shares our current research process, with the aim of making our research as transparent as possible.
About the team
The Global Health and Development (GHD) team is one of the newer departments at Rethink Priorities (RP). It officially formed in Q3 2021, and throughout 2022 the team grew from the initial four hires to its current 10 members. Our team consists of two senior research managers (Tom Hird and Melanie Basnak) overseeing eight researchers of different seniority (Greer Gosnell, Aisling Leow, Jenny Kudymowa, Ruby Dickson, Bruce Tsai, Carmen van Schoubroeck, James Hu, and Erin Braid). GHD team members have expertise in economics, health, science, and policy, and bring experience from academia, consultancy, medicine, and nonprofit work.
Our past research reports
Rethink Priorities is a research organization that strives to generate impact by providing relevant stakeholders with tools to make more informed decisions. The GHD team’s work to date has mainly been commissioned by donors looking to have a positive impact. Since its inception, the team has completed 23 reports for five different organizations/individuals, as well as two self-driven reports. We have publicly published four of these reports:
How effective are prizes at spurring innovation?
Livelihood interventions: overview, evaluation, and cost-effectiveness
The REDD+ framework for reducing deforestation and mitigating climate change: overview, evaluation, and cost-effectiveness
Exposure to Lead Paint in Low- and Middle-Income Countries
Whenever possible, we want to disseminate our findings to maximize our impact. We intend to publish 13 of the remaining 19 completed reports that we have not yet shared publicly.[1] Going forward, we hope to publish reports within three months of their completion.[2]
Most of our past reports (78%) have been commissioned by Open Philanthropy (OP). The projects we typically do for OP are “shallow” investigations looking into specific cause areas (e.g., hypertension, substandard and falsified drugs). These reports usually contain the following:
A basic introduction, especially for complex topics
An estimate of the burden for the specific problematic area (and potentially the impact one could have by focusing on that area)
This process usually involves critically engaging with existing estimates, as well as making our own
The burden estimation is often the most important part of the reports
Information about existing funding going into the area
An analysis of potential interventions to tackle the issue, which often includes:
Identifying potential interventions across different areas (e.g., policy, advocacy, market shaping, direct provision)
Evaluating potential interventions with a view to tractability and cost-effectiveness
A discussion of the main uncertainties about the area and/or the existing interventions
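As a toy illustration of what a burden estimate involves, the core arithmetic is often as simple as multiplying a few uncertain quantities. The sketch below uses entirely made-up placeholder numbers (the population, prevalence, and per-case DALY loss are all assumptions for demonstration, not figures from any actual report):

```python
# Hypothetical burden estimate in the style of the reports described above.
# All inputs are illustrative placeholders, not real figures.

population = 1.2e9          # people in the regions of interest (assumed)
prevalence = 0.03           # fraction of the population affected (assumed)
daly_per_case_year = 0.05   # average annual DALY loss per case (assumed)

cases = population * prevalence
burden_dalys = cases * daly_per_case_year

print(f"Estimated cases: {cases:,.0f}")
print(f"Estimated annual burden: {burden_dalys:,.0f} DALYs")
```

In practice, each input would be sourced from the literature (e.g., existing Global Burden of Disease estimates) or from expert interviews, with the reasoning behind each choice documented in the report.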
We have also done different types of work (for OP and others), including red-teaming (providing an outside skeptical challenge to existing work/ideas), investigating specific uncertainties around a topic following a previous report on it, and exploratory/strategy reports on relevant research within the effective altruism (EA) space.
Our research process
Our workflow
Most of our projects involve collaboration among two to three researchers of different seniority. We typically ensure that there is one senior researcher per project to act as “project lead,” handling most of the coordination and ensuring, along with the manager, that the project stays on track.[3]
Our commissioned projects usually kick off with a brief from the client that contains research questions that guide and structure our research. For internal projects (and some commissioned projects), the managers put together the briefs.
Most of our research projects, regardless of their nature or topic, involve the following components:
Desk research. We generally begin by searching for appropriate literature to answer the questions at hand. We assess the evidence (e.g., the number of quality studies in support of each idea, and their generalizability to the context of interest) and identify our uncertainties based on gaps in the literature.
Expert interviews. We interview experts on the topics we research. We are a team of generalists, and as such, we remain humble about our limited expertise in many of the areas we research and seek to synthesize expert opinions. Experts include, but are not limited to, academics, CEOs, practitioners, and government officials. When possible, particularly when a topic is polarizing, we interview experts with (very) different perspectives (e.g., for our lead paint report). We find these experts through a combination of recommendations from clients, connections from our own networks, and cold messaging relevant people identified through desk research.
Quantitative analyses. We often do quantitative analyses to estimate how cost-effective an intervention might be, how many lives it may have saved to date or save in the future, and the like. These vary in complexity, from very rough back-of-the-envelope calculations based mostly on assumptions, to more complex cost-effectiveness analyses (CEAs) drawing from a mix of data and assumptions (e.g., see the Spark CEA in our livelihoods report). We often use Excel or Google Sheets, but will on occasion use Causal or Guesstimate, depending on the client’s preferences and the project’s needs.
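To illustrate the back-of-the-envelope end of this spectrum, uncertain inputs can be represented as ranges and propagated with Monte Carlo sampling, which is roughly what tools like Guesstimate do under the hood. All input ranges below are hypothetical, chosen only for demonstration:

```python
import random

random.seed(0)

# Hypothetical BOTEC: cost per DALY averted for an illustrative intervention.
# Every input range below is an assumption for demonstration purposes only.
N = 100_000
cost_per_daly = []
for _ in range(N):
    program_cost = random.uniform(1e6, 3e6)            # total cost, USD (assumed)
    people_reached = random.uniform(50_000, 150_000)   # beneficiaries (assumed)
    dalys_per_person = random.uniform(0.01, 0.05)      # DALYs averted each (assumed)
    cost_per_daly.append(program_cost / (people_reached * dalys_per_person))

cost_per_daly.sort()
median = cost_per_daly[N // 2]
p5, p95 = cost_per_daly[int(0.05 * N)], cost_per_daly[int(0.95 * N)]
print(f"Cost per DALY averted: median ${median:,.0f} "
      f"(90% interval ${p5:,.0f} to ${p95:,.0f})")
```

A spreadsheet version would replace each range with a single point estimate; the sampling approach has the advantage of producing an uncertainty interval rather than a single number, which supports the reasoning-transparency principle discussed below.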
The amount of time spent on a given project depends on features like its scope and the number of researchers involved. The average project has involved about 60% of two full-time researchers’ time over the course of five weeks, though some projects have taken just one to two weeks.
Our reports undergo several rounds of internal review. During these periods (often in the middle and at the end of each project), the manager overseeing that project will thoroughly review drafts. Often, the other manager (and sometimes a researcher not involved in that project) will also act as a reviewer. Reviews have usually taken place ~two days before the draft or final report was due to be completed, allowing some time for the researchers to address standing comments, doubts, or concerns. In the context of commissioned research, we send this version of the report to the client.
We then spend some extra time finalizing and polishing the report for publication. This step involves checking for consistent formatting, reaching out to experts to ensure their views are represented accurately and securing permission to quote them publicly, adding an editorial note and an acknowledgments section, and conducting a final (and particularly thorough) round of internal review.
The timeline of a typical project
Below is an example timeline for a typical project to date:
Week 1:
Engage with the project brief, identifying potential “cruxes” in the research, and trying to define the scope as thoroughly as possible
Kickoff meeting with the client, where we raise questions that arose from engaging with the brief and discuss logistics
“Premortem”: a process in which we try to identify the main difficulties of completing this project and define action items to ensure we can overcome them
Team meeting to divide and coordinate the work
Initial research, getting familiar with the topic
Identifying and reaching out to experts (sometimes it takes a while for experts to get back to us, so we try to do this task as soon as possible; over the course of the project we might reach out to additional experts)
Rough “initial takes” shared with client
Weeks 2-3:
Desk research
Expert interviews
Sometimes generate quantitative models, though this often takes place later in the project
First draft: internal review, send to client, debrief meeting with client to get feedback and discuss next steps
Weeks 4-5:
Desk research
Sometimes more expert interviews
Generate quantitative models
Write a section on remaining uncertainties, and sometimes a section on “what we would do with more time”
Write executive summary
Final draft: internal review, send to client, debrief meeting with client to receive and give feedback
Week 6+:
“Retrospective”: a process in which we discuss what worked and what didn’t when conducting this project, and distill learnings for future projects
Sometimes we are asked to do a few more hours of work to answer a key question that arose from our research; we usually follow up on those requests right after the project is completed
Polish the report for publication
Throughout the course of the project, we have recurring team meetings to discuss progress, and we may reach out to the client via email or have weekly check-in calls with them to ensure short feedback loops.
Some general principles
Across topics and project types, there are some underlying principles that remain constant:
Reasoning transparency. We try to make our reasoning as transparent as possible, specifying how all sources of information included in the report contribute to our conclusions, stating our certainty levels around different claims, and pointing out major sources of uncertainty in our analyses.[4]
Intellectual honesty/humility. Our team brings diverse experience (academia, consulting, nonprofits) and areas of expertise (medicine, biology, climate change, economics, quantitative sciences). That said, we view ourselves as generalists and are not usually experts in the specific topics we research. Additionally, most of our reports are produced within a limited time frame. Thus, while we strive for rigor in our research, we recognize that our findings may not reflect the absolute truth, and we are always willing to revise our conclusions in light of new information.
Collaboration. We think there is strength in collaboration, both within RP and across value-aligned organizations. We have started conversations with other researchers in the GHD and climate spaces and are always keen to share our unpublished reports (and any other resources that could be useful) with them. We strive to be kind and respectful in all of our interactions with external researchers and stakeholders.
Future developments
Our research process has been evolving and will continue to do so. To ensure our research continually improves in rigor and thoroughness, we periodically revisit our processes. As our emphasis shifts toward internally driven research, the features and format of our reports and methodological approaches could also change.
We aim to incorporate relevant aspects (e.g., assumptions, moral weights) of research outputs from other organizations if we think they are well supported and will improve the conclusions of our reports.
We have begun to assemble guides related to some of our primary research components. For example, we are currently working on a cost-effectiveness analysis guide to converge on a more unified and replicable framework. In the spirit of transparency and collaboration, we hope to eventually make our internal guide publicly available.
We mentioned above that our reports go through several rounds of internal review. We would like to encourage and participate in external review processes in the future, for instance among researchers in other global health, development, and climate organizations and/or academics with relevant expertise. We imagine this being a collaborative endeavor, where other researchers review some of our work, and we review some of theirs.
Contributions and acknowledgments
This post was written by Melanie Basnak with feedback from the full GHD team. We would like to thank Adam Papineau for copyediting and Rachel Norman for reviewing the post and providing useful suggestions. If you are interested in Rethink Priorities’ work, you can sign up for our newsletter. We use it to keep our readers updated about new research posts and other resources.
[1] Some of our reports cannot be published because we have not secured permission from our clients to do so, and there are good reasons to withhold some of them. Other reports are very niche, and we do not think there would be a lot of value in publishing them: the time invested in preparing them for publication would outweigh the value readers might get out of them.
[2] Our publication process has been delayed in the past due to the limited size of our team, with researchers spending most of their time tackling new projects as soon as previous projects were completed. With more staff, we are now making progress in shortening the window between project completion and publication.
[3] This is not always the case. Three projects to date have been carried out by a single researcher, and four were completed without a senior researcher on board.
[4] For more on reasoning transparency, see this research report by Luke Muehlhauser of OP.