Thanks for posting this. I think explicitly asking for critical feedback is very useful.
If the intervention is not currently supported by a large body of research then we want to fund/carry out a randomized controlled trial to test whether it’s worth pursuing this intervention.
RCTs are seriously expensive, would take years to produce meaningful data, would need to be replicated before you could put much faith in them, and they wouldn’t align with the core skillset I’d imagine you’d need to be starting an organisation (so you’d need to outsource them, which would increase the costs even more). As Ryan said, it might be more useful to aim to be recommended by OPP, or to search for another kind of EA market inefficiency. Your other idea of finding supportable but neglected interventions and doing them sounds pretty useful though.
Thank you for pointing this out. I had expected them to be a lot cheaper.
If GiveWell, as Ben said, has decided against funding RCTs, I’m not very likely to be convinced of their usefulness either.
But all those costs of RCTs are clearly worth it. Expensive? If your intervention is even vaguely promising, EAs will throw enough money at you to get started. Time? Better to get started now. Replication? More cost, which EAs will fund. Outsourcing? Higher quality, which EAs will fund.
I think GiveWell has considered funding RCTs for promising interventions and decided against it. They easily cost millions of dollars, take several years, and the evidence provided is often quite weak. Best to focus on the existing evidence base from academia first, then move on to new RCTs when that’s all exhausted (ideally through partnerships with academics, which I think is what DMI did).
What makes you think EAs would provide enough money? That would be excellent if so.