It looks like you’re subscribing to a person-affecting philosophy: you say potential future humans aren’t worthy of moral consideration because they’re not being deprived of anything, yet bringing them into existence would be bad because they would (or could) suffer.
I think this asymmetry is arbitrary, and not really compatible with a total utilitarian framework. I would suggest reading the relevant chapter of Nick Beckstead’s thesis ‘On the Overwhelming Importance of Shaping the Far Future’, where I think he does a pretty good job of showing exactly this.