From an EA point of view, doing the most good is the most important thing, so socially motivated virtue signaling is defensible if its consequences add up to more good.
EAs may be more likely to think this, but this is not what I’m saying. I’m saying there is real information value in signals of genuine virtue and we can’t afford to leave that information on the table. I think it’s prosocial to monitor your own virtue and offer proof of trustworthiness (and other specific virtues) to others, not because fake signals somehow add up to good social consequences, but because it helps people to be more virtuous.
Rationalists are erring so far in the direction of avoiding false or manipulative signals that they are operating in the dark, even as they advocate more and more opaque and uncertain ways to have impact. I think that by ignoring virtue and rejecting virtue signals, rationalists are not treating the truth as "the most important thing". (In fact I think this whole orientation is itself a meta-virtue-signal: it signals that they don't need validation and don't conform, which is a real virtue, but one that is getting in the way of more important information.) It contradicts our values of truth and evidence-seeking not to gather what information we can about character, at least our own characters.
lol, see the version of this on less wrong to have your characterization of the rationalist community confirmed: https://www.lesswrong.com/posts/hpebyswwhiSA4u25A/virtue-signaling-is-sometimes-the-best-or-the-only-metric-we