This isn’t directly related to your point, but I think there are a number of practical issues with most attempts at epistemic modesty/deference that theoretical approaches like this one do not adequately account for.
1) Misunderstanding of what experts actually mean. It is often easier to defer to a stereotype in your head than to fully understand an expert’s views, or a simple approximation thereof.
Dan Luu gives the example of SV investors who “defer” to economists on the issue of discrimination in competitive markets without actually understanding (or perhaps reading) the relevant papers.
In some of those cases, it’s plausible that you’d do better trusting the evidence of your own eyes/intuition over your attempts to understand experts.
2) Misidentifying the right experts. In the US, it seems like the educated public roughly believes that “anybody with a medical doctorate” is approximately the relevant expert class on questions as diverse as nutrition, the fluid dynamics of indoor air flow (if the airflow happens to carry viruses), and the optimal allocation of limited (medical) resources.
More generally, people often default to the closest high-status group/expert to them, without accounting for whether that group/expert is epistemically superior to other experts slightly further away in space or time.
2a) Immodest modesty.* As a specific case/extension of this, when someone identifies an apparent expert or community of experts to defer to, they risk (incorrectly) believing that they have deference (on this particular topic) “figured out” and thus choose not to update on either object- or meta-level evidence that they did not correctly identify the relevant experts. The issue may be exacerbated beyond “normal” cases of immodesty if there’s a sufficiently high conviction that you are being epistemically modest!
3) Information lag. Obviously, any information you receive is to one degree or another from the past and risks being outdated. Of course, this lag applies to all evidence you have; at the most trivial level, even sensory experience isn’t really in real time. But I think it should be reasonable to assume that attempts to read expert claims/consensus are disproportionately likely to have a significant lag problem, compared to your own present evaluation of the object-level arguments.
4) Computational complexity in understanding the consensus. Trying to understand the academic consensus (or lack thereof) from the outside might be very difficult, to the point where establishing your own understanding from a different vantage point might be less time-consuming. Unlike 1), this presupposes that you would be able to correctly understand/infer what the experts mean; the issue is just that it might not be worth the time to do so.
5) Community issues with groupthink/difficulty in separating out beliefs from action. In an ideal world, we make our independent assessments of a situation, report them to the community in what Kant[1] calls the “public (scholarly) use of reason”, and then defer to an all-things-considered, epistemically modest view when we act on our beliefs in our private roles as citizens.
However, in practice I think it’s plausibly difficult to separate out what you personally believe from what you feel compelled to act on. One potential issue with this is that a community that’s overly epistemically deferential will plausibly have less variation and lower affordance for making mistakes.
--
*As a special case of that, people may be unusually bad at identifying the right experts when said experts happen to agree with their initial biases, either on the object level or for meta-level reasons uncorrelated with truth (e.g. they use similar diction, have similar cultural backgrounds, etc.).
[1] ha!
This comment is great, strong-upvoted.
There are enough individual and practical considerations here (in both directions) that in many situations the actual thing I would advocate for is something like “work out what you would do with both approaches, check against results ‘without fear or favour’, and move towards whatever method is working best for you”.
Thanks for the compliment!
Yeah that makes sense! I think this is a generally good approach to epistemics/life.
I agree that lots of these considerations are important. On 2) especially, I agree that being epistemically modest doesn’t make things easy because choosing the right experts is a non-trivial task. One example of this is using AI researchers as the correct expert group on AGI timelines, which I have myself done in the past. AI researchers have shown themselves to be good at producing AI research, not at forecasting long-term AI trends, so it’s really unclear that this is the right way to be modest in this case.
On 4, I also agree. I think coming to a sophisticated view will often involve deferring to different groups of experts on specific sub-questions. Like maybe you defer to climate scientists on what will happen to the climate, philosophers on how to think about future costs, economists on the best way forward, etc. Identifying the correct expert groups is not always straightforward.
Thanks for the reply! One thing you and AGB reminded me of, which my original comment elided, is that some of these personal and “practical” considerations apply in both directions. For example, for #4 there are many/most cases where understanding the expert consensus is easier, not harder, than coming up with your own judgment.
It’d perhaps be interesting if people produced a list of the most important/common practical considerations in either direction, though ofc much of that will be specific to the individual/subject matter/specific situation.