And even if you assume consequentialism to be true and set moral uncertainty aside, I believe this is the sort of thing where the empirical uncertainty is so deep, and the potential for profound harm so great, that we should err strongly on the side of not doing things that intuitively seem terribly wrong. Commonsense morality is a decent (if not perfect) starting point for estimating the net consequences of actions. I'm not sure I'm making this point very clearly, but the general reasoning is discussed in this essay: Ethical Injunctions.
More generally I would say that – with all due respect to OP – this is an example of a risk associated with longtermist reasoning, whereby terrible things can seem alluring when astronomical stakes are involved. I think we, as a community, should be extremely careful about that.
Strongly agree with alexrjl here.