The Three Peaks of the Long-Term Future
Any realistic distribution of the impact of a long-termist intervention has three peaks: one in the extreme positive, one in the extreme negative, and a narrow, needle-like peak centered at zero.
A successful long-termist intervention does some quantity of good for a really large number of people/sentient beings. You might model its expected utility as some kind of bell-shaped distribution centered somewhere in the extreme positive.
But what if things go wrong? What if you preserve a civilization not worth preserving (e.g. factory farming goes to the stars)? What if you call attention to a potential weapon by agitating about x-risk and, in doing so, destroy civilization? The probability of these scenarios is not literally zero, so there is a second peak centered somewhere in the extreme negative. These two peaks are not necessarily equally large or equally far away from zero.
A third peak is at zero. There are attractors: things that make slightly different states of the world converge on the same outcome. For example, you might found an organization doing outstanding work on moving us towards the glorious post-speciesist vegan future. There are positive feedback loops and threshold effects involved in such an endeavor: success seems out of reach until it is suddenly there. Thanks to you, instead of being 5% of the way there, we are 20% of the way there when a global dictatorship takes over. It cracks down on all political activism, not because it disagrees with what you're doing on the object level, but because you are tacitly questioning its authority, and that's a no-no. Impact: approximately zero. For any long-termist intervention you will be able to come up with a scenario like this, where your progress doesn't matter, and it will have a non-zero probability.
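To make the shape of the argument concrete, here is a minimal sketch of the three-peak model as a mixture distribution. Every number in it (the weights, peak locations, and spreads) is made up purely for illustration, not an estimate of any actual intervention:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical three-peak mixture for the utility of a long-termist intervention.
# All numbers are illustrative assumptions, not estimates.
weights = {"positive": 0.20, "negative": 0.05, "zero": 0.75}  # probabilities, sum to 1
means   = {"positive": 1e9,  "negative": -5e8, "zero": 0.0}   # peak locations (utility, arbitrary units)
sds     = {"positive": 3e8,  "negative": 2e8,  "zero": 1e3}   # the peak at zero is needle-thin

def sample_utility(n: int) -> np.ndarray:
    """Draw n samples from the three-peak mixture."""
    components = rng.choice(list(weights), size=n, p=list(weights.values()))
    return np.array([rng.normal(means[c], sds[c]) for c in components])

samples = sample_utility(100_000)
analytic_mean = sum(weights[c] * means[c] for c in weights)

print(f"Monte Carlo expected utility: {samples.mean():.3g}")
print(f"Analytic expected utility:    {analytic_mean:.3g}")
```

The point is purely structural: most of the probability mass sits in the needle at zero, and the sign of the expectation is decided entirely by the relative weights and distances of the two tail peaks.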
This model should give us pause and dampen our enthusiasm for long-termist interventions. Their expected utility distributions look a lot more like those of short-term interventions, especially once we factor in the flow-through effects the latter might have (which give them three peaks as well).
Relatedly, see Counterproductive Altruism: The Other Heavy Tail.