To clarify, I was defining the different forms of EA more along the lines of ‘how they evaluate impact’, rather than which specific projects they think are best.
Short-run focused EA evaluates interventions by their short-run effects. Long-run focused EA also tries to take account of long-run effects.
Extreme long-run EA combines a focus on long-run effects with other unintuitive positions, such as a focus on specific x-risks. Moderate long-run EA doesn't.
The point of moderate long-run EA is that it’s much less clear which interventions are best by these standards.
I wasn't trying to say that moderate long-run EA should focus on promoting economic growth and building better institutions, only that these are valuable outcomes. And it's pretty unclear that we should prefer malaria nets (which were selected mainly on the basis of short-run, immediate impact) to other efforts to do good that are widely pursued by smart altruists outside of the EA community.
A moderate long-run EA could even think that malaria nets are the best option (at least for money, if not human capital), but they'll be more uncertain about it and give greater weight to flow-through effects.
Yes, moderate long-run EA is more uncertain and doesn’t have “fully formed” answers—but that’s the situation we’re actually in.