“most naive attempts to do so can easily do more harm than good.”
I agree that factoring in optics can accidentally do harm, but I think if we’re trying to approximately maximise EV, we should be happy to risk doing harm.
I’m sure factoring in optics will sometimes lead to optics being overweighted, but I’m still unclear on why you think optics would be overweighted more often than not, and why ignoring optics is a better solution to overweighting than factoring it in. If we’re worried about overweighting it, can’t we just weight it less?
If I’m interpreting your comment correctly, you’re arguing that systematic biases in how we estimate the value of optics mean that we’re better off not factoring optics into EV calculations at all.
There are other systematic biases, besides the “wanting to fit in” bias, that affect EV calculations—self-serving biases might cause EAs to overestimate the value of owning nice conference venues, or the value of time saved through meal delivery services or Ubers. I think consistency would require you to argue that we should not factor the value of these things into EV calculations—I’d be interested to get your thoughts on this.
(My view is that we should continue to factor everything in and just consciously reduce the weighting of things that we think we might be prone to overweighting or overvaluing.)
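(For concreteness, a minimal sketch of what “weight it less” could mean, with notation that is purely illustrative rather than anything spelled out in this thread: score an option as
$$\mathrm{EV} \approx V_{\text{direct}} + \lambda \, \hat{V}_{\text{optics}}, \qquad 0 < \lambda < 1,$$
where $\hat{V}_{\text{optics}}$ is the naive estimate of the optics value and $\lambda$ is a deliberate discount applied because we suspect we tend to overweight optics.)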
Thanks for copying your comment!

“I’m sure factoring in optics will sometimes lead to optics being overweighted, but I’m still unclear on why you think optics would be overweighted more often than not, and why ignoring optics is a better solution to overweighting than factoring it in.”
My main argument is that “naive” weighting of optics is common and can easily do more harm than good, and that sophisticated weighting of optics is just really hard, even if you’re aware of this problem! If “not weighting optics at all” is a better strategy than naively weighting optics, then I recommend not weighting it at all.
And I think that the kind of people who talk a lot about weighting optics have systematic biases towards conservatism and overweighting optics (obviously there are people who have the reverse biases and should care way more!!).
I think there are clearly sophisticated strategies that are better, but I’m concerned that they’re hard to follow in practice, while the strategy of “don’t overthink optics” is fairly easy to follow.
“If we’re worried about overweighting it, can’t we just weight it less?”
I think this is true in theory but very hard to do in practice—it’s just really hard to account for your biases correctly, even if you’re aware of them and trying to correct for them!
“There are other systematic biases, besides the “wanting to fit in” bias, that affect EV calculations—self-serving biases might cause EAs to overestimate the value of owning nice conference venues, or the value of time saved through meal delivery services or Ubers. I think consistency would require you to argue that we should not factor the value of these things into EV calculations—I’d be interested to get your thoughts on this.”
This is a fair point! I haven’t thought much about this, but I do think that a similar argument goes through there. Time saved feels easier to me—it’s much easier to quantify, and putting a price on someone’s time is a common enough calculation that you can at least try to be consistent.
Nice conference venues are harder to quantify and easier to be self-serving about, and I do agree with you there. (Though if the person making the decision spends the conference worked off their feet doing operations/management, and dealing with dumb BS to do with having a nice venue, like historical building protection, I’m less concerned about bias!)
EDIT: And to clarify, my position is not “optics never matter and it’s a mistake to think about them”. I just think that it’s difficult to do right, and important to be careful about this, and often reasonable to decide it’s not a significant factor. Eg, I can see the case for not caring about optics with Wytham, but I think that if you’re eg killing people for their organs, optics are a pretty relevant consideration! (Along with all of the other reasons that’s a terrible and immoral idea)