This. I’m imagining some Abrodolph Lincoler-esque character—Abronard Willter, maybe—putting me in a brazen bull and cooing, ‘Don’t worry, this will all be over soon. I’m going to create 10 billion more of you, also in brazen bulls, so the fact that I continue to torture you personally will barely matter.’
The claim isn’t that creating identical copies of a person sums to less moral worth than one person; it’s that the copies sum to no more than one person. Torturing one person is still quite bad.
By that inference, if you are one of those copies, the ‘moral worth’ of your own perceived torture would be one ten-billionth of its normal level. So there’s a huge selfish upside: I might prefer being one of 10 billion identical torturees as long as I uniquely get a nice back scratch afterwards, for example.