Epistemic deference is the process of updating one’s beliefs in response to what others appear to believe, even if one is unaware of the reasons for those beliefs or does not find those reasons persuasive. The question of when, how, and to what extent a rational agent should defer to others has been studied—from somewhat different angles—by philosophers working in social epistemology and by economists working in game theory.
Further reading
Aird, Michael (2020) Collection of discussions of epistemic modesty, “rationalist/EA exceptionalism”, and similar, LessWrong, March 30.
Many additional resources on this topic or related topics.
LessWrong (2021) Modest epistemology, LessWrong Wiki, April 7.
Frances, Bryan & Jonathan Matheson (2018) Disagreement, Stanford Encyclopedia of Philosophy, February 23 (updated 13 November 2019).
Huemer, Michael (2019) On challenging the experts, Fake Nous, July 6.
Related entries
altruistic coordination | discussion norms | expertise | independent impressions | inside vs. outside view | moral trade | principle of epistemic deference
Deference is one part of social epistemology, and it’s perhaps slightly inelegant to have a tag for the conjunction of a topic and a proper part of that topic. So this tag could perhaps be split into two: “epistemic deference” and “social epistemology”. Or you could just have an “epistemic deference” tag, if there isn’t enough content to go under a “social epistemology” tag.
If you go for having a separate deference tag, I think it’s better to call it “epistemic deference”, since the word “deference” can also be used in other, non-epistemic senses.
How to balance inside vs outside views doesn’t seem to be part of social epistemology, strictly speaking, since the problem arises for individuals as well.
Thanks. I agree that the current name is not fully satisfactory.
I just spent some time looking into the social epistemology literature, and I think the only part of it of sufficient relevance to cover on the Wiki is the part concerned with the epistemology of disagreement. The other main branches, namely the attribution of epistemic states to collective agents and the status of belief based on testimony, seem significantly less relevant. On the other hand, the body of the entry mentions various other phenomena of interest, such as balancing inside and outside views, and avoiding anchoring and information cascades. Insofar as we want to cover these topics, I think we should do so in separate articles, rather than amalgamating them all in a single entry without a crisp focus.
In light of this, my proposal is to rename this article epistemic deference and incorporate into it any relevant content from the epistemic disagreement article (which was imported from EA Concepts). We already have an article on inside vs. outside view, and articles on information cascade and anchoring could be created, if needed.
Let me know if you are happy with these changes. (This is addressed not only to Stefan, but to anyone reading this comment.)
Thanks, that sounds good.
Okay, I changed the title, updated the entry, and deleted the other entry. I’m not sure what to do with that content, so I’m copying it below for the time being.
--
Sometimes we find that we disagree with smart, well-informed people about how the world is or ought to be, even if those people seem to share our prior beliefs, values, and evidence. If we think that there is a restricted range of credences that are rational given the same prior beliefs and evidence, then disagreement with others is evidence that at least one of us has made a mistake in our reasoning (White 2005). This is not so surprising in cases where the evidence is complex and difficult to assess.
What should we do when we encounter this kind of disagreement? The responses to this problem range from more “conciliatory” responses, which say that we ought to move closer to the views of our peers in cases of disagreement, to less conciliatory responses, which say that we ought to stick to our guns if we haven’t made a mistake.
More conciliatory responses, such as the view that we should treat ourselves and others like truth thermometers (White 2009), offer more actionable advice than radically non-conciliatory views. The latter say that you should stick with your own views if you are right and move to your interlocutor’s views if they are right, even though you don’t know which of you is right. And sometimes we want to treat the testimony of others as evidence, even if we don’t have access to their reasoning. Results like Condorcet’s jury theorem (Wikipedia 2005) suggest that if many independent agents converge on the same answer to a question, we should treat that as good evidence that the answer they have converged on is correct.
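The convergence claim behind Condorcet’s jury theorem can be checked directly. The sketch below (the function name `majority_correct_prob` is my own, not from any source cited above) computes the exact probability that a majority of n independent voters, each correct with probability p, reaches the right answer, using the binomial distribution:

```python
import math

def majority_correct_prob(n, p):
    """Exact probability that a strict majority of n independent voters,
    each individually correct with probability p, votes for the correct
    answer. Assumes n is odd so there are no ties."""
    # Sum binomial probabilities over all outcomes with more than n/2
    # correct votes.
    return sum(math.comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n // 2 + 1, n + 1))

# Condorcet's jury theorem: if each voter is even slightly better than
# chance (p > 0.5), the majority's reliability grows toward 1 as n grows.
for n in (1, 11, 101, 1001):
    print(n, majority_correct_prob(n, 0.55))
```

With p = 0.55, the majority’s reliability rises monotonically with group size, approaching certainty for large n; this is the sense in which convergence among many independent agents is strong evidence.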
Bibliography
Goldman, Alvin & Cailin O’Connor (2019) Peer disagreement, in ‘Social epistemology’, Stanford Encyclopedia of Philosophy, August 28 (updated 2 March 2021).
White, Roger (2005) Epistemic permissiveness, Philosophical Perspectives, vol. 19, pp. 445–459.
Discusses how much evidence determines appropriate beliefs, and whether it might be epistemically permissible to believe a range of things.
White, Roger (2009) On treating oneself and others as thermometers, Episteme, vol. 6, pp. 233–250.
Wikipedia (2005) Condorcet’s jury theorem, Wikipedia, September 19 (updated 14 March 2021).
Related entries
discussion norms | EA culture | epistemic humility | principle of epistemic deference
Previous name for this tag was “epistemic humility”, which seems confusing because ‘modesty vs. humility’ is an old distinction on LessWrong, and ‘epistemic humility’ here is referring more to the ‘modesty’ side than the ‘humility’ side. I’ve changed it to “deference and social epistemology” for now.
Twelve Virtues of Rationality: “To be humble is to take specific actions in anticipation of your own errors. [...] Who are most humble? Those who most skillfully prepare for the deepest and most catastrophic errors in their own beliefs and plans. [...] To be human is to make ten thousand errors. No one in this world achieves perfection.”
The Proper Use of Humility: “This is social modesty, not humility. It has to do with regulating status in the tribe, rather than scientific process. If you ask someone to ‘be more humble,’ by default they’ll associate the words to social modesty—which is an intuitive, everyday, ancestrally relevant concept. Scientific humility is a more recent and rarefied invention, and it is not inherently social. Scientific humility is something you would practice even if you were alone in a spacesuit, light years from Earth with no one watching. Or even if you received an absolute guarantee that no one would ever criticize you again, no matter what you said or thought of yourself. You’d still double-check your calculations if you were wise.”
Status Regulation and Anxious Underconfidence: “I try to be careful to distinguish the virtue of avoiding overconfidence, which I sometimes call ‘humility,’ from the phenomenon I’m calling ‘modest epistemology.’”
Thanks, I think the change makes sense.
More broadly, I wonder whether this and the epistemic disagreement articles should be merged. We could use “epistemology of disagreement” as the name for the new article.
FWIW, I think that:
The new name seems to me like an improvement from the old name (which I wrote)
When making this tag, I’d forgotten that there was a relevant modesty vs humility distinction here; I had considered them synonymous when originally naming the tag
This is a bit weird, because I’d read the things Rob cites in his comment, and in fact I even cited one myself earlier today
I guess this is an indication that it’s good I now use Anki sometimes (rather than never)!
It does seem good to avoid creating confusion by conflating the two terms here
The two articles should probably be merged
The new name for this tag seems better to me than “epistemology of disagreement”
I think the new name better fits what I originally had in mind as the scope of the tag
E.g., I wanted it to cover something like how people tend to update on other people’s opinions and how that should affect the way we communicate (e.g., being clear about when we’re reporting independent impressions vs all-things-considered beliefs)
One reason (maybe the main one): I think many situations involving “deference” don’t really involve what we’d normally think of as “disagreement”
It might be going from having no opinion on a topic at all to now having the opinion Toby Ord has
I think (but am not sure) that the subfield in epistemology that studies the “epistemology of disagreement” is concerned with deference in general and not just disagreement in particular, but I’m happy to keep the current name. In general, I would prefer that articles be named after existing scholarly fields when they have the same or very similar scope, though I agree that in this case there may be a tradeoff between that desideratum and the independently desirable goal of adopting a name that describes its subject matter more transparently.
I just noticed that the epistemic disagreement article does focus exclusively on disagreement and doesn’t discuss deference more broadly, so I’m hesitant to just move it here without first making substantial edits. Anyway, I’ve made a note to take a closer look at some point, though if anyone wants to try merging the two, go for it.
Makes sense to me!
Maybe eventually this entry should mention the idea of an ideological Turing test? And/or maybe the Epistemology entry should? And/or maybe that concept should get its own entry?
Here’s one post that mentions the idea (I haven’t thoroughly looked for others): https://forum.effectivealtruism.org/posts/ReJT7ck9Em2xQANSz/how-we-can-make-it-easier-to-change-your-mind-about-cause
I like the idea of having a short entry on that concept. I’ll see if I can draft something.