In longtermism, a trajectory change is a persistent change to total value at every point in the long-term future. Trajectory changes have also been described as slight or significant changes to “the world’s development trajectory, or just trajectory for short”, with that referring to:[1]
a rough summary way the future will unfold over time. The summary includes various facts about the world that matter from a macro perspective, such as how rich people are, what technologies are available, how happy people are, how developed our science and culture is along various dimensions, and how well things are going all-things-considered at different points of time. It may help to think of the trajectory as a collection of graphs, where each graph in the collection has time on the x-axis and one of these other variables on the y-axis.
[...] If we ever prevent an existential catastrophe, that would be an extreme example of a trajectory change. There may also be smaller trajectory changes. For example, if some species of dolphins that we really loved were destroyed, that would be a much smaller trajectory change.
Most longtermist interventions focus on trajectory changes, including but not limited to existential risk reduction. However, some longtermist interventions focus on alternative objectives, such as speeding up development.
Further reading
Forethought Foundation (2018) Longtermism: Potential research projects, Forethought Foundation, November.
Koehler, Arden, Benjamin Todd, Robert Wiblin & Keiran Harris (2020) Benjamin Todd on varieties of longtermism and things 80,000 Hours might be getting wrong, The 80,000 Hours Podcast, September.
Related entries
cultural persistence | existential risk | longtermism
[1] Beckstead, Nick (2013) A proposed adjustment to the astronomical waste argument, Effective Altruism Forum, May 27.
Does existential risk reduction qualify as a trajectory change? It seems not, by the definition given above.
On a related note, this definition of trajectory change, and the way Will MacAskill uses the term in What We Owe the Future (similar if not identical), do not match what I intuitively assumed when I first heard the phrase “trajectory change.”
My layperson understanding was that changing the trajectory of civilization over the next few decades might, for example, increase the probability of successfully navigating some critical period of high existential risk.
In WWOTF, MacAskill says (IIRC) that a change counts as a positive trajectory change only if total welfare per year is higher. But we could imagine a change in civilization’s trajectory that actually decreases welfare for people over the next few decades, yet better prepares us for existential risks, such that the long-term future is higher in expected value. I’d want to call this a positive trajectory change, but the definition given here and in WWOTF doesn’t classify it as one.