[Question] Ends vs. Means: To what extent should we be willing to sacrifice nuance and epistemic humility when communicating with people outside EA?

“Do we want accuracy, or do we want them to… make us more aware of extreme events and possibilities?” Thus begins Tyler Cowen’s conversation with Philip Tetlock. After they’ve established that epidemiologists *had* warned that the consumption of bats and other exotic meat could lead to a SARS-like pandemic, Tyler asks, “So those forecasters maybe weren’t entertaining enough?”

As EAs, we tend to be careful about accurately representing the evidence and our interpretations thereof. We try to avoid cherry-picking data, and we sometimes signpost our uncertainty with “epistemic status” disclaimers. I love this aspect of the EA community. It’s also very different from the bombastic, (over)confident style of figures who capture the public imagination, e.g. Rutger Bregman, who can confidently state that “UBI works” without feeling the need to advertise his epistemic status or provide disclaimers. Of course, it’s not quite that straightforward, and he probably knows that. For example, under what conditions does it work? How are we measuring whether it works? Steven Pinker’s *The Better Angels of Our Nature* arguably became famous in part because of its bold, easy-to-summarize thesis and his downplaying of limitations, omissions, and counterexamples. Our style is also different from the dramatic appeals made by UN officials. To my chagrin, such quotes get picked up and repeated, even by people who should know better.

Forecasting is but one (albeit important) aspect of EA. We aren’t satisfied with merely predicting what will happen; we want to effect change. Nonetheless, Tyler’s underlying question remains: if we ultimately want people to act, how much should we prize accuracy vs. entertainment value? One way of effecting change is to alter the public consciousness, to introduce or popularize ideas that then become actions. This brings me to the philosophical argument that, in one study, most effectively convinced research participants to donate a surprise bonus payment to charity, at rates significantly higher than those of a control group. Although it was written by Peter Singer and Matthew Lindauer, it conspicuously lacked some of the hallmarks of EA writing. For example, it didn’t say *how many* children suffer from trachoma, it appealed to emotion, and it didn’t compare the intervention’s cost-effectiveness to that of other possible interventions.

In my advocacy work, I regularly come across claims that are widely used despite being somewhat inaccurate or incomplete. And I’m torn. In our attempts to persuade the broader public, how much should we be willing to sacrifice our internal commitment to accuracy, including epistemic modesty and hedging? In brief, to what extent do the ends justify the means?

-----

Note:

I see parallels between the above and the concept of musicians ‘selling out’, i.e. changing their sound from what most appeals to their smaller community to one with wider appeal. That said, I see this as different from the debate over how broad the EA community should be. I’m not calling for a change to our internal style of communication; again, I love it and wish it were more common. Rather, I’m asking how far we should be willing to go in sacrificing our internal values when trying to affect the public consciousness, assuming (safely, I think) that a style of writing that works for us doesn’t necessarily work in the wider world.