Thanks, yeah that’s definitely worth addressing. I was implicitly thinking that strict replaceability was the philosophically interesting/objectionable claim. The mere possibility of high-stakes swamping seems a bit more generic, and less distinctive to longtermism. E.g. neartermists may be equally committed to killing (or failing to save) one innocent in order to save a sufficiently large number of other, already-existing people. In general, not wanting to be sacrificed isn’t a good reason to deny that others have value at all. But yeah, worth mentioning this in the paper itself.
My sense is that many people will think killing for replacement is distinctively objectionable, however many people are being added and however good their lives are, even though they accept that in extreme cases it's okay to kill one to save very many who already exist. To capture that intuition, you need more than the claim that current people's lives should get a lot of priority; the priority has to be infinite.