One consistent frame I’ve seen with EAs is a much higher emphasis on “How can I frame this to avoid looking bad to as many people as possible?” rather than “How can I frame this to look good and interesting to as many people as possible?”
Something the “cold hard truth about the ice bucket challenge” post did (correctly, I think) was be willing to be deliberately controversial and polarizing. This is something EAs generally seem to avoid, and there’s a broad sense that these sorts of marketing framings are “dark arts” that one should not touch.
On one hand, I see the argument that framing the facts in the most positive light is bad for an epistemic culture and could hurt EA’s reputation; on the other hand, I think EA is so allergic to this that the allergy itself hurts it. This looks like risk aversion bias with respect to both public perception and epistemic climate, and EA has irrationally erred too far towards caution.
Another frequent mistake I see along the same vein (although less common among the higher-status people in the movement) is confusing epistemic and emotional confidence. People often think that if they’re unsure about an opinion, they need to appear unsure of themselves when stating it.
The problem with this, in the context of the above post, is that appearing unsure of yourself signals low status. The antidote is to detach your sure-o-meter from your feeling of confidence, so that you can verbally state your confidence levels without seeming unsure of yourself. If you do this in the EA community today, there can be a stigma around apparent epistemic overconfidence that’s difficult to overcome, even though this is the correct way to maximize both epistemic modesty and outside perception.
So, to sum up my suggestions for concrete ways that people in organizations could start taking status effects more into account:

- Shift from “How can I frame the truth to avoid looking bad?” to “How can I frame the truth to look good?”
- Work to detach your emotional confidence from your epistemic confidence, especially in public settings.
The problem with this, in the context of the above post, is that appearing unsure of yourself signals low status. The antidote is to detach your sure-o-meter from your feeling of confidence, so that you can verbally state your confidence levels without seeming unsure of yourself.
This is one of the most interesting points I’ve seen on the Forum in a long while. It perfectly captures the distinction I feel between certain people who I consider excellent speakers in the EA movement and people who don’t give me that feeling. At first, I thought this was something like high charisma vs. low charisma, but that wasn’t quite right; you don’t need to be charismatic and charming to speak with confidence about your uncertainty.
Relatedly, there are the concepts of ‘uncertainty’ and ‘insecurity’. I think there’s a risk that uncertainty is perceived, and perhaps even experienced, as insecurity. Interestingly, both concepts translate to one and the same word in Dutch: “onzekerheid”!
However, I think stating epistemic uncertainty in a precise and confident way (e.g. “I believe X, and I am 60% certain my hypothesis is correct”) can show meta-confidence and strong epistemics. I would rather learn to be convincing while still communicating uncertainties than learn to hide my epistemic uncertainty.
Also, experts in every domain face this challenge, and useful lessons could be drawn from the literature on it, such as this paper (I’ve only read the abstract, but it seems useful).