When I reflect on the nature of torture, it seems obvious that it's very bad. But I'm not sure how, from reflection on the experience alone, we can conclude that there's no amount of positive bliss that could ever outweigh it. We literally can't conceive of how good transhuman bliss might be, and any attempt to add up trillions of minor positive experiences seems very sensitive to scope neglect.
Your point that I simply can’t conceive of how good transhuman bliss might be is fair. :) I might indeed change my intuitions if I were to experience it (if that were possible; it’d require a lot of changes to my brain first). I guess we might change our intuitions about many things if we had more insight—e.g., maybe we’d decide that hedonic experience itself isn’t as important as some other things. There’s a question of to what extent we would regard these changes of opinion as moral improvements versus corruption of our original values.
I guess I don’t feel very motivated by the abstract thought that if I were better able to comprehend transhuman-level bliss I might better see how awesome it is and would therefore be more willing to accept the existence of some additional torture in order for more transhuman bliss to exist. I can see how some people might find that line of reasoning motivating, but to me, my reaction is: “No! Stop the extra torture! That’s so obviously the right thing to do.”
That's true of your current intuitions, but I care about what we would care about if we were fully rational and informed. If there were bliss so good that it would be worth enduring ten minutes of horrific torture in exchange for ten minutes of it, then creating this bliss for ungodly numbers of sentient beings seems like quite an important ethical priority.
Yeah, that’s a fair position to hold. :) The main reason I reject it is that my motivation to prevent torture is stronger than my motivation to care about how my values might change if I were to experience that bliss. Right now I feel the bliss isn’t that important, while torture is. I’d rather continue caring about the torture than allow my loyalty to those enduring horrible experiences to be compromised by starting to care about some new thing that I don’t currently find very compelling.
There’s always a bit of a tricky issue regarding when moral reflection counts as progress and when it counts as just changing your values in ways that your current values would not endorse. At one extreme, it seems that merely learning new factual information (e.g., better data about the number of organisms that exist) is something we should generally endorse. At the other extreme, undergoing neurosurgery or taking drugs to convince you of some different set of values (like the moral urgency of creating paperclips) is generally something we’d oppose. I think having new experiences (especially new experiences that would require rewiring my brain in order to have them) falls somewhere in the middle between these extremes. It’s unclear to me how much I should merely count it as new information versus how much I should see it as hijacking my current suffering-focused values. A new hedonic experience is not just new data but also changes one’s motivations to some degree.
The other problem with the idea of caring about what we would care about upon further reflection is that the outcome of such reflection could be many different things, depending on exactly how the reflection process occurs. That's not necessarily an argument against moral reflection at all, and I still like to do moral reflection, but it does at least reduce my confidence that moral reflection is definitely progress rather than just value drift.
Here’s an intuition pump: Is there any number of elegant scientific discoveries made in a Matrix, where no sentient beings at all would benefit from technologies derived from those discoveries, that would justify murdering someone? Scientific discoveries do seem valuable, and many people have the intuition that they’re valuable independent of their applications. But is it scope neglect to say that whatever their value, that value just couldn’t be commensurable with hedonic wellbeing? If not, what is the problem in principle with saying the same for happiness and suffering?
I don’t have the intuition that scientific discoveries are valuable independent of their use for sentient beings.
Fair enough, I don’t either. But there are some non-hedonic things that I have some intuition are valuable independent of hedonics—it’s just that I reject this intuition upon reflection (just as I reject the intuition that happiness is valuable independent of relief of suffering upon reflection). Is there anything other than hedonic well-being that you have an intuition is independently good or bad, even if you don’t endorse that intuition?
Yeah, to some degree I have egalitarian intuitions pre-reflection, along with some other small non-utilitarian intuitions.