The Happier Lives Institute connects donors, researchers, and policymakers with the most cost-effective opportunities to increase global wellbeing.
Using the latest subjective wellbeing data, we identify the problems that matter most to people and find evidence-based ways to solve them.
Thank you for your comments, Gregory. We’re aware you have strong views on the subject and we appreciate your conscientious contributions. We discussed your previous comments internally but largely concluded revisions weren’t necessary because we (a) had already considered them in the report and appendix, (b) plan to return to them in later versions and don’t expect they would materially affect the results, or (c) simply disagree with these views. To unpack:
Study quality. We conclude the dataset does contain bias, but we account for it (sections 3.2 and 5; how best to do this is an open question among academics). We don’t believe the entire field of LMIC psychotherapy should be considered bunk, compromised, or uninformative. Our results are in line with existing meta-analyses of psychotherapy considered to have low risk of bias.[1]
Evidentiary standards. We drew on a large number of RCTs for our systematic reviews and meta-analyses of cash transfers and psychotherapy (42 and 74, respectively). If one holds that the evidence for something as well-studied as psychotherapy is too weak to justify any recommendations, charity evaluators could recommend very little.
Outlier exclusion. The issues regarding outlier exclusion were discussed in some depth (section 3.2 in the main report and Appendix B). Excluding outliers is considered sensible practice here; two related meta-analyses (Cuijpers et al., 2020c; Tong et al., 2023) used a similar approach. It’s consistent with neither taking the entire literature at face value nor taking guilt by association too far. If one excludes outliers, the specific way one does so has a minor effect (e.g., a 10% decline in effectiveness; see appendix). Our analysis necessarily makes analytic choices: some were pre-registered, some were made on reflection, and many were discussed in our sensitivity analysis. If one insisted on using only charity evaluations that had every choice pre-registered, there would be none to choose from.
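To illustrate the general idea (this is not our actual analysis, and all the effect sizes below are invented), here is a minimal sketch of why outlier exclusion matters in a meta-analysis: a couple of implausibly large effects can dominate an otherwise modest average.

```python
# Hypothetical illustration of outlier exclusion in a simple (unweighted)
# meta-analytic mean. The effect sizes are made up for this example.
import statistics

# Standardised mean differences (SMDs) from hypothetical trials.
effects = [0.2, 0.3, 0.4, 0.5, 0.6, 0.5, 0.4, 2.5, 3.1]  # last two are outliers

# One common rule of thumb: drop effects above a cutoff (here g > 2),
# on the grounds that such effects are rarely credible for psychotherapy.
CUTOFF = 2.0
trimmed = [g for g in effects if g <= CUTOFF]

mean_all = statistics.mean(effects)
mean_trimmed = statistics.mean(trimmed)
print(f"All studies:      {mean_all:.2f} SDs")
print(f"Outliers removed: {mean_trimmed:.2f} SDs")
```

In this toy example the two extreme studies more than double the pooled estimate; real meta-analyses typically use inverse-variance weighting and more principled exclusion rules, but the direction of the effect is the same.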
Bayesian analysis. The method we use (‘grid approximation’, see 8.3 and 9.3) avoids subjective inputs. It is not the Bayesian analysis itself that ‘stacks the deck’ in favour of psychotherapy, but the evidence. Given that over 70 studies form the prior, it would be surprising if adding one study, as we did for StrongMinds, radically altered the conclusions. [Edit 5/12/2023: on the point that StrongMinds could be more cost-effective than GiveDirectly even if StrongMinds only has the small effect we assume it does in our hypothetical placeholder studies: it doesn’t seem inconceivable that a cheap, less effective intervention can still be more cost-effective than an expensive, more effective one. For context, we estimate it costs StrongMinds $63 per intervention—providing one person with a course of therapy—whereas it costs GiveDirectly $1,221 per intervention—a $1,000 cash transfer plus $221 in overheads. As the therapy is about 20x cheaper, it can be far less effective yet still more cost-effective.]
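The claim that one small study barely moves a prior built from 70+ studies can be sketched with grid approximation itself. This is not our actual model; the prior, likelihood, and all numbers below are invented purely to show how the mechanics work: discretise the parameter, multiply prior by likelihood on the grid, and renormalise.

```python
# Minimal sketch of Bayesian updating by grid approximation.
# All distributions and numbers are illustrative, not HLI's actual inputs.
import math

def normal_pdf(x, mu, sd):
    """Density of a Normal(mu, sd) at x."""
    return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

# Grid over the true effect size (in SDs), from -0.5 to 1.5 in steps of 0.001.
grid = [i / 1000 for i in range(-500, 1501)]

# Prior: meta-analytic evidence from many studies -> fairly tight around 0.5.
prior = [normal_pdf(g, 0.5, 0.05) for g in grid]

# Likelihood: one new, small, imprecise study estimating a near-zero effect
# (point estimate 0.1, standard error 0.3).
likelihood = [normal_pdf(0.1, g, 0.3) for g in grid]

# Posterior on the grid: elementwise product, then normalise to sum to 1.
unnorm = [p * l for p, l in zip(prior, likelihood)]
total = sum(unnorm)
posterior = [u / total for u in unnorm]

post_mean = sum(g * p for g, p in zip(grid, posterior))
print(f"Posterior mean: {post_mean:.3f} SDs")  # barely moved from the 0.5 prior
```

With these illustrative numbers the posterior mean lands near 0.49: the single imprecise study nudges, but does not overturn, the tight prior, which is the qualitative point above.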
Making recommendations. We aim to recommend the most cost-effective ways of increasing WELLBYs we’ve found so far. While we have intuitions about how good different interventions are, our perspective as an organisation is that conclusions about what’s cost-effective should be led heavily by the evidence rather than by our pre-evidential beliefs (‘priors’). Given the evidence we’ve considered, we don’t see a strong case for recommending cash transfers over psychotherapy.
This is a working report, and we’ll be reflecting on how to incorporate the points above, other similarly psychotherapy-sceptical perspectives, and further views as we prepare it for academic review. In the interests of transparency: to preserve team resources, we don’t plan to engage beyond the comments above.
We find an initial effect of 0.70 SDs, which reduces to 0.46 SDs after publication bias adjustments. Cuijpers et al. 2023 find an effect of psychotherapy of 0.49 SDs for studies with low risk of bias (RoB) in low-, middle-, and high-income countries (comparisons = 218), which reduces to between 0.27 and 0.57 after publication adjustment. Tong et al. 2023 find an effect of 0.69 SDs for studies with low RoB in non-western countries (primarily low- and middle-income; comparisons = 36), which adjusts to between 0.42 and 0.60 after publication correction. Hence, our initial and adjusted numbers are similar.
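The comparison in this footnote can be checked mechanically: our publication-bias-adjusted estimate (0.46 SDs) falls inside both published adjusted ranges.

```python
# Sanity check on the footnote's comparison, using the figures stated above.
ours_adjusted = 0.46  # HLI's estimate after publication bias adjustment

ranges = {
    "Cuijpers et al. 2023 (low RoB, adjusted)": (0.27, 0.57),
    "Tong et al. 2023 (low RoB, non-western, adjusted)": (0.42, 0.60),
}

for name, (lo, hi) in ranges.items():
    inside = lo <= ours_adjusted <= hi
    print(f"{name}: {lo}-{hi} SDs; contains {ours_adjusted}? {inside}")
```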