(I have not read all of your sequence.) I’m confused about how being even close to 100% on something like this is appropriate. My sense is generally just that population ethics is hard, humans have somewhat weak minds in the space of possible minds, and our later post-human views on ethics might be far more subtle or quite different.
I’m a moral anti-realist (subjectivist), so I don’t think there’s an objective (stance-independent) fact of the matter. I’m just describing what I would expect to continue to endorse under (idealized) reflection, which depends on my own moral intuitions. The asymmetry is one of my strongest moral intuitions, so I expect not to give it up, and if it conflicts with other intuitions of mine, I’d sooner give those up instead.