Certainly. Some of those values I mentioned might be counted as direct forms of utility, and some might be counted as necessary means to the end of greater total utility later. And the repugnant conclusion can always win by turning up the numbers a bit and making Population Z’s lives pretty decent compared to the smaller Population A.
Partially I am just trying to describe the imagery that occurs to me when I look at the “population A vs population Z” diagram.
I guess I am also using the repugnant conclusion to point out a complaint I have against varieties of utilitarianism that endorse stuff like “tiling the universe with rats on heroin”. To me, once you start talking about very large populations, diversity of experiences is just as crucial as positive valence. That’s because without lots of diversity I start doubting that you can add up all the positive valence without double-counting. For example, if you showed me a planet filled with one million supercomputers all running the exact same emulation of a particular human mind thinking a happy thought, I would be inclined to say, “that’s more like one happy person than like a million happy people”.
I have the same feeling. I have an aversion to utility tiling as you describe it, but I can’t exactly pinpoint why, other than that I guess I am not a utilitarian. As consequentialists, perhaps we should focus more on the ends themselves, i.e. how much we aesthetically like the look of potential future universes, rather than on the expected utility of said universes. E.g. Star Wars is prettier to me than an expansive von Neumann probe network, so I should prefer that. Of course, this is just rejecting utilitarianism again.