Creating identical copies of people is not claimed to sum to less moral worth than one person. It’s claimed to sum to no more than one person. Torturing one person is still quite bad.
By inference, if you are one of those copies, the ‘moral worth’ of your own perceived torture would be one ten-billionth of its normal level. So, selfishly, that’s a huge upside—I might selfishly prefer being one of 10 billion identical torturees as long as I uniquely get a nice back scratch afterwards, for example.