Thanks for raising this, Zeren!
One way I’d push back is with a more human-centered lens: even if digital minds could vastly increase total utility, does that mean we should rush to replace ourselves?
There’s a difference between creating value and preserving something irreplaceable: embodied experience, emotional depth, culture, and human vulnerability. If a moral theory says we should phase out humanity in favor of scalable minds, maybe that’s not a reason to obey it; it’s a reason to question its framing.
Some things have value beyond aggregation.
-Astelle