Crossposted from an EA Forum comment.
There are a number of practical issues with most attempts at epistemic modesty/deference that theoretical approaches do not adequately account for.
1) Misunderstanding of what experts actually mean. It is often easier to defer to a stereotype in your head than to fully understand an expert’s views, or even a simple approximation thereof.
Dan Luu gives the example of Silicon Valley investors who “defer” to economists on the issue of discrimination in competitive markets without actually understanding (or perhaps reading) the relevant papers.
In some of those cases, it’s plausible that you’d do better trusting the evidence of your own eyes/intuition over your attempts to understand experts.
2) Misidentifying the right experts. In the US, it seems like the educated public roughly believes that “anybody with a medical doctorate” is approximately the relevant expert class on questions as diverse as nutrition, the fluid dynamics of indoor airflow (if that airflow happens to carry viruses), and the optimal allocation of limited (medical) resources.
More generally, people often default to the closest high-status group/expert to them, without accounting for whether that group/expert is epistemically superior to other experts slightly further away in space or time.
2a) Immodest modesty.* As a specific case/extension of this, when someone identifies an apparent expert or community of experts to defer to, they risk (incorrectly) believing that they have deference (on this particular topic) “figured out,” and thus choose not to update on either object- or meta-level evidence that they did not correctly identify the relevant experts. The issue may be exacerbated beyond “normal” cases of immodesty if there’s a sufficiently high conviction that you are being epistemically modest!
3) Information lag. Obviously, any information you receive is to some degree or another from the past, and risks being outdated. Of course, this lag happens for all evidence you have. At the most trivial level, even sensory experience isn’t really in real time. But I think it should be reasonable to assume that attempts to read expert claims/consensus are disproportionately likely to have a significant lag problem, compared to your own present evaluations of the object-level arguments.
4) Computational complexity in understanding the consensus. Trying to understand the academic consensus (or lack thereof) from the outside might be very difficult, to the point where establishing your own understanding from a different vantage point might be less time-consuming. Unlike 1), this presupposes that you are able to correctly understand/infer what the experts mean; the issue is just that doing so might not be worth the time.
5) Community issues with groupthink/difficulty in separating out beliefs from action. In an ideal world, we make our independent assessments of a situation and report them to the community, in what Kant calls the “public (scholarly) use of reason,” and then defer to an all-things-considered epistemically modest view when we act on our beliefs in our private role as citizens.
However, in practice I think it’s plausibly difficult to separate out what you personally believe from what you feel compelled to act on. One potential issue with this is that a community that’s overly epistemically deferential will likely have less variation and lower affordance for making mistakes.
--
*As a special case of that, people may be unusually bad at identifying the right experts when said experts happen to agree with their initial biases, either on the object level or for meta-level reasons uncorrelated with truth (e.g., they use similar diction, have similar cultural backgrounds, etc.).