So, I have a slate of questions that I often ask people to try to understand them better. Recently I realized that one of these questions may not be as open-ended as I'd thought, in the sense that it may actually have a proper answer according to Bayesian rationality, though I remain uncertain about this. I've also posted this question to the Less Wrong open thread, but I'm curious what Effective Altruists in particular would think about it. If you'd rather, you can private message me your answer. Keep in mind that the question is intentionally somewhat ambiguous.
The question is:
Truth or Happiness? If you had to choose between one or the other, which would you pick?
All else being equal, I’d pick happiness.
What understanding do you get from this question, out of interest? Do particular groups tend to answer it one way or another?
Well, as the question is formed, it seems to help gauge a number of different tendencies. One is obviously whether an individual is aware of the difference between instrumental and terminal goals. Another is what kinds of sacrifices they are willing to make, as well as their degree of risk aversion. In general, I find that most people answer Truth, but that when faced with an actual situation of this sort, they tend to show a preference for Happiness.
So far I'm less certain about whether particular groups actually answer it one way or another. It seems like cautious, risk-averse types favour Happiness, while risk-neutral or risk-seeking types favour Truth. My sample size is a bit small to make such generalizations, though.
Probably the most important thing I get from this question is what kind of decision process people use in situations of ambiguity and uncertainty, as well as how decisive they are.
Interesting. I'm struggling to imagine why that might be. Any theories?
A possible explanation is simply that truth tends to be information that may or may not be useful. With some small probability it might be very useful, even life-saving, information. The ambiguity of the question means that while you may not be happy with the information, it could conceivably benefit others greatly, or not at all. Guaranteed happiness, on the other hand, is much more certain and concrete. At least, that's the way I imagine it.
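To make that intuition a bit more concrete, here is a minimal sketch of the expected-utility comparison I have in mind, treating Truth as a lottery with a small chance of a large payoff and Happiness as a certain moderate payoff, with a concave utility function standing in for risk aversion. All of the probabilities and payoff values are made-up assumptions for illustration, not anything measured.

```python
# Toy expected-utility comparison: uncertain Truth vs. guaranteed Happiness.
# All payoffs and probabilities below are illustrative assumptions only.
import math

# Truth: a small chance of being extremely valuable, otherwise nearly worthless.
truth_lottery = [(0.05, 500.0),   # assumed: 5% chance it is life-saving information
                 (0.95, 1.0)]     # assumed: otherwise it barely matters
happiness_payoff = 10.0           # assumed: a certain, moderate benefit

def risk_neutral(x):
    # Utility equals the raw payoff: the agent cares only about expected value.
    return x

def risk_averse(x):
    # Concave (log) utility: a standard way of modelling risk aversion.
    return math.log(1 + x)

for name, u in [("risk-neutral", risk_neutral), ("risk-averse", risk_averse)]:
    eu_truth = sum(p * u(v) for p, v in truth_lottery)
    eu_happiness = u(happiness_payoff)
    choice = "Truth" if eu_truth > eu_happiness else "Happiness"
    print(f"{name}: EU(Truth)={eu_truth:.2f}, EU(Happiness)={eu_happiness:.2f} -> {choice}")
```

Under these assumed numbers, the risk-neutral agent prefers the Truth lottery (its expected value is higher), while the risk-averse agent prefers guaranteed Happiness, which matches the pattern described above.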
I've had at least one person explain their choice by saying that truth is harder to get than happiness, because they could always figure out a way to be happy by themselves.
I think the hope is that there doesn’t have to be a choice.
Truth, no hesitation.
A big question, but why?