Yeah. Another piece of this that I didn’t fully articulate before is that I think the “honesty” of virtue signaling is very often hard to pin down. I get why people have a visceral and negative reaction to virtue signaling when it’s being cynically and transparently used as a justification for, or a distraction from, doing things that are not virtuous at all, and it’s not hard to find examples of people doing this in practice. Even in that scenario, though, I think it’s a mistake to focus on the virtue signaling itself rather than the not-virtuous actions/intentions as the main problem. Like, if you have an agent with few or no moral boundaries who wants to do a selfish thing, why should we be surprised that they’re willing to be manipulative in the course of doing that?
I think cases like these are pretty exceptional, though, as are cases where someone is using virtue signaling to express profound and stable convictions. I suspect it’s much more often the case that virtue signaling occupies a sort of ambiguous space where it might not be completely authentic but does at least partly reflect some authentic aspiration towards goodness on the part of either the person doing it or the community they’re a part of. And I think that aspiration is really important at the community level, or at least in any community that I’d want to be a part of, and virtue signaling in practice plays an important role in keeping it alive.
Anyway, since “virtue” is in the eye of the beholder, it would be pretty easy to say that rationalists define “truth-seeking” as a virtue and that there’s a whole lot of virtue signaling on LessWrong around that (see: epistemic status disclaimers, “I’m surprised to hear you say that,” “I’d be happy to accept a bet on this at x:y odds,” etc.).
“Even in that scenario, though, I think it’s a mistake to focus on the virtue signaling itself rather than the not-virtuous actions/intentions as the main problem. Like, if you have an agent with few or no moral boundaries who wants to do a selfish thing, why should we be surprised that they’re willing to be manipulative in the course of doing that?”
If you think of virtue signaling as a really important coordination mechanism, then abusing that system is additionally very bad on top of the object-level bad thing.