Thanks, this is one of my favourite responses here. I appreciated your sharing your mental imagery and listing out some consequences of that imagery. I think I am more inclined than you to say that many people alive today have lives not worth living, but you address confusion about that point in another comment. And while I’m more pro-hedonium than you, I also wonder about “tiling” issues.
Do your intuitions about this stay consistent if you reverse the ordering? That is, as I think another comment on this post said elsewhere, if you start with a large population of just-barely-happy people, and then replace them with a much smaller population of very happy people, does that seem like a good trade to you?
Yes, my intuition stays the same if the ordering is reversed; population A seems better than population Z and that’s that.
(For instance, suppose the population of an isolated valley had grown so much, and people had subdivided their farmland so far, that each plot of land was barely enough for subsistence and the people regularly suffered conflict and famine. In most situations I would think it good if those people voluntarily made a cultural change towards having fewer children, such that over a few generations the population would shrink to, say, 1⁄3 of the original level, and everyone had enough buffer that they could live in peace with plenty to eat and live much happier lives. Of course I would have trouble “wishing people into nonexistence” depending on how much the metaphysical operation seemed to resemble snuffing out an existing life… I would always be inclined to let people live out their existing lives.)
Furthermore, I could even be tempted into accepting a trade of Population A (whose lives are already quite good, much better than barely-worth-living) for a utility-monster-style, even smaller population of extremely good lives. But at this point I should clarify that although I might be a utilitarian, I am not a “hedonic” utilitarian, and I find it weird that people are always talking about the positive emotional valence of experience rather than a more complex basket of values. I already mentioned how I value diversity of experience. I also highly value something like intelligence or “developedness of consciousness”:
It seems silly to me that the ultimate goal becomes Superhappy states of incredible joy and ecstasy. Perhaps this is a failure of my imagination, since I am incapable of really picturing just how good Superhappy states would be. Or perhaps I have cultural blinders that try to ward me off wireheading (via drug addiction, etc.) by indoctrinating me to believe statements like “life isn’t all about happiness; being connected to reality & other people is important, and having a deep understanding of the universe is better than just feeling joyful”.
Imagine the following choice: “Take the blue pill and you’ll experience unimaginable joy for the rest of your life (not just one-note heroin-esque joy, but complex joy that cycles through the multiple different shades of positive feeling that the human mind can experience). Take the red pill, and you’ll experience a massive increase in the clarity of your consciousness together with a gigantic boost in IQ to superhuman levels, allowing you to have many complex experiences that are currently out of reach for you, just like how rats are incapable of using language, understanding death, etc. But despite all those revelations, your net happiness level will be totally similar to your current life.” Obviously the joy has its appeal (both are great options!), but I would take the red pill.
Although I care about the suffering of animals like chimps and factory-farmed chickens and would incorporate it into my utilitarian calculus, I also think that there is a sense in which no number of literal-rats-on-heroin could fully substitute for a human. If you offered me the chance to trade 1 human life for creating a planet with 1 quadrillion rats on heroin, I’d probably take that deal over and over for the first few thousand button-presses. But I wouldn’t just keep going until Earth ran out of people, because I’d never trade away the last complex, intelligent human life just to get one more planet of blissed-out lower life forms.
By contrast, I’d have far fewer qualms going the other way, and trading Earth’s billions of humans for a utopian super-civilization with mere millions of super-enhanced, godlike transhuman intelligences.
Even with my basket of Valence + Diversity-of-experience + Level-of-consciousness, I still expect that utilitarianism of any kind is more like a helpful guide for doing cost-benefit calculations than a final moral theory whose assumptions (for instance, that moral value scales absolutely linearly forever when you add more lives to the pile) we can expect to hold robustly in extreme situations. I think this belief is compatible with being very, very utilitarian compared to most ordinary people, just like how I believe that GDP growth is an imperfect proxy for what we want from our civilization, but I am still very, very pro economic growth, much more so than ordinary people.