You (and the 5 people who agreed) are blowing my mind right now.
Based on the last paragraph, it sounds like you would support a world full of opiate users—provided there was a sustainable supply of opiates.
The first paragraph is what’s blowing my mind though. When I was a kid, I’m pretty sure I would have told you that a room with toys and sweets would maximize my happiness. I guess you could argue that I’d eventually find out that it would not sustain my long-term happiness, but I really do think some amount of suffering ensures happiness in the future. Perhaps this is overly simple, but I’m sure you have fasted at some point (intentionally or not) and that you greatly appreciated your next meal as a result.
Lastly, you separate knowledge and feelings from suffering, but I’m not sure this can be done. My parents told me not to do X because it would hurt, but I did not learn until I experienced X firsthand.
I’m amazed that so many EAs apparently think this way. I don’t want to be mean, but I’m curious: what altruistic actions have you taken in your life? Really looking forward to your reply.
I think your argument is actually two separate claims:
1) It is not obvious how to maximize happiness, and some obvious-seeming strategies to maximize happiness will not in fact maximize it.
2) You shouldn’t maximize happiness.
(1) is true, I think most EAs agree with it, most people in general agree with it, I agree with it, and it’s pretty unrelated to (2). It means maximizing happiness might be difficult, but says nothing about whether it’s theoretically the best thing to do.
Relatedly, I think a lot of EAs agree that it is sometimes the case that to maximize happiness, we must incur some suffering. To obtain good things, we must endure some bad. Not realizing that, and always avoiding suffering, would indeed have bad consequences. But the fact that this is true, and important, says nothing about whether it is good. It is the case right now that eating the food I like most would make me sick, but that doesn’t tell me whether I should modify myself to enjoy healthier foods more, if I were able to do so.
Put differently, is the fact that we must sometimes endure suffering to get happiness good in itself, or is it an inconvenient truth we should (remember, but) change, if possible? That’s a hard question, and I think it’s easy to slip into the trap of telling people they are ignoring a fact about the world, as a way of avoiding hard ethical questions about whether the world can and should be changed.
I agree that points 1 and 2 are unrelated, but I think most people outside EA would agree that a universe of happy bricks is bad. (As I argued in a previous post, it’s pretty indistinguishable from a universe of paperclips.) This is one problem that I (and possibly others) have with EA.
I second this! I’m one of the many people who think that maximizing happiness would be terrible. (I mean, there would be worse things you could do, but compared to what a normal, decent person would do, it’s terrible.)
The reason is simple: when you maximize something, by definition that means being willing to sacrifice everything else for the sake of that thing. Depending on the situation you are in, you might not need to sacrifice anything else; in fact, depending on the situation, maximizing that one thing might lead to lots of other things as a bonus—but in principle, if you are maximizing something, then you are willing to sacrifice everything else for the sake of it. Justice. Beauty. Fairness. Equality. Friendship. Art. Wisdom. Knowledge. Adventure. The list goes on and on. If maximizing happiness required sacrificing all of those things, such that the world contained none of them, would you still think it was the right thing to do? I hope not.
(Moreover, based on the laws of physics as we currently understand them, maximizing happiness WILL require us to sacrifice all of the things mentioned above, except possibly Wisdom and Knowledge, and even they will be concentrated in one being or kind of being.)
This is a problem with utilitarianism, not EA, but EA is currently dominated by utilitarians.
I suspect that happiness and well-being are uncorrelated. Just a guess. I am biased as I believe I have grown as a result of changes which were the result of suffering. Your point is valid though—if we could control our environment, would altruists seek to create an opiate-type effect on all people? I guess it’s a question that doesn’t need an answer anytime soon.
I suspect that happiness and well-being are uncorrelated.
How are you defining wellbeing such that it’s uncorrelated with happiness?
I am biased as I believe I have grown as a result of changes which were the result of suffering.
Perhaps you misunderstand me. I believe you. I think that probably every human and most animals have, at some point, learned something useful from an experience that involved suffering. I have, you have, all EAs have, everyone has. Negative subjective wellbeing arising from maladaptive behavior is evolutionarily useful. Natural selection favored organisms that responded to negative experiences by learning from them.
I just think it’s sad and shitty that the world is that way. I would very much prefer a world where we could all have equally or more intense and diverse positive experiences without suffering for them. I know that is not possible (or close to it) right now, but I refuse to let the limitations of my capabilities drive me to self-deception.
I think I understand your point. Opiates have a lot of negative connotations. Maybe a nervous system whose pleasure sensors are constantly triggered is a better example. I should have said that I am biased by the fact that I live in an environment where this isn’t possible. You explained it more simply.
Well-being is very tricky to define, isn’t it? I like it a lot more than ‘maximizing happiness’ or ‘minimizing suffering,’ which was mostly what inspired the OP. I guess we don’t know enough about it to define it perfectly, but as Bill said, do we need to?