Great point. But note that if lives of monk-like tranquility are neutral, that makes the Mirrored Repugnant Conclusion harder to accept:
The total view in population ethics implies this Mirrored Repugnant Conclusion.
If lives of monk-like tranquility are neutral, then lives of monk-like tranquility plus a mosquito bite are barely bad, and so the total view implies: for any population of lives of terrible suffering, there is some much larger population of these barely bad lives whose existence would be worse.
Yeah, it’s difficult to intuit, but I think that’s pretty clearly because we’re bad at imagining the aggregate harm of billions (or trillions) of mosquito bites. One way to reason around this is to think:
- I would rather get punched once in the arm than once in the ribs, but I would rather get punched once in the ribs than 10x in the arm
- I’m fine with disaggregating, and saying that I would prefer a world where 1 person gets punched in the ribs to a world where 10 people get punched in the arm
- I’m also fine with multiplying those numbers by 10, and saying that I would prefer 10 people punched in the ribs to 100 people punched in the arm
- It’s harder to intuit this for really, really big numbers, but I am happy to attribute that to a failure of my imagination rather than to some bizarre effect where total utilitarianism only holds for small populations
- I’m also fine with intensifying the first harm a little, so long as the increase in population size offsets it (e.g. I would prefer 1 person punched in the face to 1000 people punched in the arm)
- Again, it’s hard to continue to intuit this for really extreme harms and really large populations, but I am more willing to attribute that to cognitive failures and biases than to a bizarre ethical rule
Etc., etc. The toy calculation below makes the aggregation step explicit.
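Here is a minimal totalist sketch of that aggregation step. The harm magnitudes are assumed purely for illustration (1 unit of disvalue per arm punch, 5 per rib punch); nothing in the argument depends on the exact numbers.

```latex
% Totalist scoring (sketch): a world's disvalue is the sum of
% per-person harms, and the world with the smaller sum is preferred.
% Assumed magnitudes: arm punch = 1, rib punch = 5.
\[
  D(\text{world}) \;=\; \sum_i h_i
\]
\[
  \underbrace{1 \times 5}_{\text{1 rib punch}}
  \;<\;
  \underbrace{10 \times 1}_{\text{10 arm punches}},
  \qquad
  \underbrace{10 \times 5}_{\text{10 people, ribs}}
  \;<\;
  \underbrace{100 \times 1}_{\text{100 people, arm}}.
\]
% In general, n1 people suffering harm h1 is preferred to n2 people
% suffering h2 iff n1*h1 < n2*h2, and scaling both sides by the same
% factor never flips the inequality, which is why the small-number
% verdicts extend unchanged to arbitrarily large populations.
```

The mosquito-bite case runs on the same arithmetic: however tiny the per-bite harm ε, n bites sum to nε, which for large enough n exceeds any fixed quantity of concentrated suffering.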
I don’t think you’re forced to say that if a life with x utility is neutral, a life with x − 1 utility is bad. It seems to me that the most plausible version of the OP’s approach would have a very wide neutral band.
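For concreteness, one way to formalize that (a sketch of a critical-range-style view, not the OP’s own definition; the thresholds l and u are illustrative assumptions):

```latex
% A neutral band [l, u] with l < 0 < u (thresholds assumed):
\[
  \text{a life at welfare } w \text{ is }
  \begin{cases}
    \text{good}    & \text{if } w > u, \\
    \text{neutral} & \text{if } l \le w \le u, \\
    \text{bad}     & \text{if } w < l.
  \end{cases}
\]
% If the band is wide relative to a mosquito bite, a neutral life at
% welfare x remains neutral at x - 1, so the step from "neutral" to
% "barely bad" in the mirrored argument does not go through.
```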
Yes, nice point. We could depart from the total view and go for a neutral band. But it’s worth noting that this move comes with problems of its own (for instance, wide neutral bands tend to make many populations incomparable with one another).