I don’t focus exclusively on philanthropic funding. I added these paragraphs to the post to clarify my position:
I agree that a full accounting of neglectedness should consider all resources going towards the cause (not just philanthropic ones), and that ‘preventing nuclear war’ more broadly receives significant attention from defence departments. However, even considering those resources, it still seems about as neglected as biorisk.
And the amount of philanthropic funding still matters because certain important types of work in the space can only be funded by philanthropists (e.g. lobbying or other policy efforts you don’t want to originate within a certain national government).
I’d add that if there’s almost no EA-inspired funding in a space, there are likely to be some promising gaps that someone applying that mindset could find.
In general, it’s a useful approximation to think of neglectedness as a single number, but the ultimate goal is to find good grants, and to do that it’s also useful to break down neglectedness into different types of resources, and consider related heuristics (e.g. that there was a recent drop).
--
Causes vs. interventions more broadly is a big topic. The very short version is that I agree doing cost-effectiveness estimates of specific interventions is a useful input into cause selection. However, I also think the INT framework is very useful. One reason is that it seems more robust. Another reason is that in many practical planning situations that involve accumulating expertise over years (e.g. choosing a career, building a large grantmaking programme) it seems better to focus on a broad cluster of related interventions.
E.g. you could do a cost-effectiveness estimate of corporate campaigns and determine that ending factory farming is most cost-effective. But once you’ve spent 5 years building career capital in factory farming, the available interventions, or your calculations about them, will likely be very different.
Thanks for clarifying, Ben!

I’d add that if there’s almost no EA-inspired funding in a space, there are likely to be some promising gaps that someone applying that mindset could find.
Agreed, although my understanding is that you think the gains are often exaggerated. You said:
Overall, my guess is that, in an at least somewhat data-rich area, using data to identify the best interventions can perhaps boost your impact in the area by 3–10 times compared to picking randomly, depending on the quality of your data.
Again, if the gain is just a factor of 3 to 10, it makes complete sense to me to focus on cost-effectiveness analyses rather than on funding levels.
In general, it’s a useful approximation to think of neglectedness as a single number, but the ultimate goal is to find good grants, and to do that it’s also useful to break down neglectedness into different types of resources, and consider related heuristics (e.g. that there was a recent drop).
Agreed. However, deciding how much to weigh a given relative drop in a fraction of funding (e.g. philanthropic funding) requires understanding its cost-effectiveness relative to other sources of funding. In this case, it seems more helpful to assess the cost-effectiveness of e.g. doubling philanthropic nuclear risk reduction spending instead of merely quantifying the drop.
Causes vs. interventions more broadly is a big topic. The very short version is that I agree doing cost-effectiveness estimates of specific interventions is a useful input into cause selection. However, I also think the INT framework is very useful. One reason is that it seems more robust.
The product of the 3 factors in the importance, neglectedness and tractability framework is the cost-effectiveness of the area, so I think the increased robustness comes from considering many interventions. However, one could also (qualitatively or quantitatively) aggregate the cost-effectiveness of multiple (decently scalable) representative promising interventions to estimate the overall marginal cost-effectiveness (promisingness) of the area.
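As a toy illustration of that product (all numbers hypothetical, units following the common presentation of the framework), the three factors can be assigned units that cancel into good done per extra dollar:

```python
# A minimal sketch, with hypothetical numbers, of how the three INT factors
# multiply into a marginal cost-effectiveness: the intermediate units cancel.

current_spending = 10_000_000  # $ currently going to the area (hypothetical)

importance = 1000.0                     # good done per % of the problem solved
tractability = 0.1                      # % of the problem solved per 1% increase in resources
neglectedness = 100 / current_spending  # % increase in resources per extra $

# (good / % solved) * (% solved / % resources) * (% resources / $) = good / $
marginal_cost_effectiveness = importance * tractability * neglectedness
print(marginal_cost_effectiveness)  # 0.001 good done per extra dollar
```

The point of the sketch is only that the product is itself a marginal cost-effectiveness, so aggregating intervention-level estimates and multiplying the INT factors are two routes to the same quantity.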
Another reason is that in many practical planning situations that involve accumulating expertise over years (e.g. choosing a career, building a large grantmaking programme) it seems better to focus on a broad cluster of related interventions.
I agree, but I did not mean to argue for deemphasising the concept of cause area. I just think the promisingness of areas is best assessed by doing cost-effectiveness analyses of representative (decently scalable) promising interventions.
E.g. you could do a cost-effectiveness estimate of corporate campaigns and determine that ending factory farming is most cost-effective.
To clarify, the estimate for the cost-effectiveness of corporate campaigns I shared above refers to marginal cost-effectiveness, so it does not directly refer to the cost-effectiveness of ending factory farming (which is far from a marginal intervention).
But once you’ve spent 5 years building career capital in factory farming, the available interventions, or your calculations about them, will likely be very different.
My guess would be that the acquired career capital would still be quite useful in the context of the new top interventions, especially considering that welfare reforms have been top interventions for more than 5 years[1]. In addition, if Open Philanthropy is managing their funds well, (all things considered) marginal cost-effectiveness should not vary much across time. If the top interventions in 5 years were expected to be less cost-effective than the current top interventions, it would make sense to direct funds from the worst/later to the best/earlier years until marginal cost-effectiveness is equalised (in the same way that it makes sense to direct funds from the worst to best interventions in any given year).
Open Phil granted 1 M$ to The Humane League’s cage-free campaigns in 2016, 7 years ago. Saulius Šimčikas’ analysis of corporate campaigns looks into ones which happened as early as 2005, 19 years ago.
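The equalisation argument above can be sketched with a toy model (all returns hypothetical): if each year's spending has diminishing returns, a fixed budget is best spent by repeatedly funding whichever year currently offers the highest marginal cost-effectiveness, which drives the marginal returns towards equality across years.

```python
# A toy sketch (hypothetical returns) of equalising marginal cost-effectiveness
# across two years of spending. Each extra unit spent in a year yields a
# diminishing marginal return; the greedy allocation below keeps funding the
# year with the higher marginal return until the budget runs out.

def marginal_return(base, spent):
    """Diminishing marginal cost-effectiveness: base / (1 + spent)."""
    return base / (1 + spent)

bases = [10.0, 6.0]  # year 1 starts out more cost-effective than year 2
spent = [0, 0]
budget = 100         # grants of 1 unit each

for _ in range(budget):
    # Fund whichever year currently has the higher marginal return.
    best = max(range(len(bases)), key=lambda y: marginal_return(bases[y], spent[y]))
    spent[best] += 1

m1 = marginal_return(bases[0], spent[0])
m2 = marginal_return(bases[1], spent[1])
print(spent, round(m1, 3), round(m2, 3))  # marginal returns end up roughly equal
```

The same greedy logic applies to directing funds from the worst to the best interventions within a single year; time only adds another dimension over which marginal cost-effectiveness gets equalised.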