Executive summary: Moral error—where future beings endorse a suboptimal civilization—poses a significant existential risk by potentially causing the loss of most possible value, even if society appears functional and accepted by its inhabitants.
Key points:
- **Definition of moral error and mistopia** – Moral error occurs when future beings accept a society that is vastly less valuable than what could have been. A mistopia is a society that, while not necessarily worse than nothing, is only a small fraction as good as it could be.
- **Sources of moral error** – Potential errors arise from population ethics, theories of well-being, the moral status of digital beings, and trade-offs between happiness and suffering, among others. Mistakes in these areas could lead to a civilization that loses most of its potential value.
- **Examples of moral errors** – These include prioritizing happiness machines over autonomy, favoring short-lived beings over long-lived ones, failing to properly account for digital beings' moral status, and choosing homogeneity over diversity.
- **Meta-ethical risks** – A civilization could err in deciding whether to encourage value change or value stasis, leading either to unreflective moral stagnation or to uncontrolled value drift.
- **Empirical mistakes** – Beyond philosophical errors, incorrect factual beliefs (e.g., mistakenly believing interstellar expansion is impossible) could also result in moral errors with large consequences.
- **Moral progress challenges** – Unlike past moral progress, which was driven by the advocacy of the disenfranchised, many future moral dilemmas involve beings (e.g., digital entities) who cannot advocate for themselves, making moral error harder to avoid.
This comment was auto-generated by the EA Forum Team. Feel free to point out issues with this summary by replying to the comment, and contact us if you have feedback.