Executive summary: In this introductory post for the Better Futures essay series, William MacAskill argues that future-oriented altruists should prioritize humanity’s potential to flourish—not just survive—since we are likely closer to securing survival than to achieving a truly valuable future, and the moral stakes of flourishing may be significantly greater.
Key points:
Two-factor model: The expected value of the future is the product of our probability of surviving and the value of the future conditional on survival (see the formula sketched after this list).
We’re closer to securing survival than flourishing: While extinction risk this century is estimated at 1–16%, the value we might achieve conditional on survival is likely only a small fraction of what’s possible.
Moral stakes favor flourishing: If we survive but realize only 10% of the best feasible future’s value, the expected loss from non-flourishing could be 36 times greater than the expected loss from extinction risk (see the worked arithmetic after this list).
Neglectedness of flourishing: The human drive to survive receives far more societal and philanthropic attention than efforts to ensure long-term moral progress or meaningful flourishing.
Tractability is a crux: While flourishing-focused work is less clearly tractable than survival-focused work, MacAskill believes this could change with sustained effort—much like AI safety and biosecurity did over the past decade.
Caution about utopianism: The series avoids prescribing a single ideal future and instead supports developing “viatopia”—a flexible, open-ended state from which humanity can continue making moral progress.
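As a minimal sketch of the two-factor model mentioned above (the notation here is ours, not necessarily the post’s):

$$\mathbb{E}[\text{value of the future}] = P(\text{survival}) \times \mathbb{E}[\text{value} \mid \text{survival}]$$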
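The 36× figure in the key points can be reproduced under illustrative assumptions of our own choosing (the post’s exact inputs may differ): take an extinction risk of $r = 2.5\%$, a value within the 1–16% range cited above, and a flourishing fraction of $f = 10\%$ of the best feasible future’s value. Comparing the share of potential value lost to non-flourishing with the share lost to extinction risk:

$$\frac{1 - f}{r} = \frac{0.90}{0.025} = 36$$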
This comment was auto-generated by the EA Forum Team. Feel free to point out issues with this summary by replying to the comment, and contact us if you have feedback.