When should you defer to expertise? A useful heuristic (Crosspost from EA forum)

An important part of the search for knowledge is knowing what, and whom, to defer to, since we must take much of what we know on trust in expertise. But how should you defer on an issue, given that whole areas could be badly wrong?

I'll introduce some heuristics from Chris Hallquist that might help you defer better.
They can be used in the following ways:
When an EA defers to a non-EA expert, or the movement as a whole defers to non-EA expertise.
When a less knowledgeable EA defers to a more knowledgeable EA on something.
When someone outside a field defers to an insider expert.
Now, before I begin, I want to list some caveats:
The heuristic only applies to non-moral fields.
The heuristic assumes the field is sound. In an upcoming post, I'll talk about signs that a field may rest on unsound foundations, and what to expect in that case.
It's not a replacement for expected value (EV) calculations.
If you're in a field, or plan to work in a cause area, it's best to replace this heuristic with the approach in this post by Emrik: The underappreciated value of original thinking below the frontier.
https://www.lesswrong.com/posts/KmkZriGwkn2vDx8gB/the-underappreciated-value-of-original-thinking-below-the

But let's begin.
1. When the data show an overwhelming consensus in favor of one view (say, when the share of dissenters is below the Lizardman's Constant, the roughly 4% of survey respondents who will endorse almost any claim), this almost always ought to swamp any other evidence a non-expert might think they have on the issue.
2. When a strong but not overwhelming majority of experts favor one view, non-experts should take this as strong evidence for that view, but there is a greater chance that it could be outweighed by other evidence (even evidence available to a non-expert).
3. When there is only a bare majority view among experts, or no agreement at all, this is much less informative than the previous two conditions. It may indicate that agnosticism is the appropriate attitude, but in many cases non-experts needn't hesitate to form their own opinion.
4. Expert opinion should be discounted when it could be predicted solely from information not relevant to the truth of the claims. This may be the only reliable, easy heuristic a non-expert can use to figure out that a particular group of experts should not be trusted.
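To make the shape of these conditions concrete, here is a minimal sketch in Python of how they might be turned into a rough decision rule. The numeric thresholds (the ~4% Lizardman's Constant, the cutoffs for "strong" and "bare" majorities) and the function name are my own illustrative assumptions, not anything taken from Hallquist's original post.

```python
# Toy sketch of the deference heuristic above. The thresholds are
# illustrative assumptions, not values from the original post.

LIZARDMAN_CONSTANT = 0.04  # rough share of respondents who will endorse almost anything


def deference_advice(share_agreeing: float, predictable_from_irrelevant_info: bool) -> str:
    """Map the share of experts endorsing a view to a rough level of deference.

    share_agreeing: fraction of experts (0.0 to 1.0) who endorse the view.
    predictable_from_irrelevant_info: True if expert opinion can be predicted
        from factors unrelated to the truth of the claim (condition 4), in
        which case the consensus gets discounted regardless of its size.
    """
    if predictable_from_irrelevant_info:
        return "Discount expert opinion; it tracks something other than the evidence."
    if share_agreeing >= 1.0 - LIZARDMAN_CONSTANT:
        return "Overwhelming consensus: this should swamp almost any evidence you have."
    if share_agreeing >= 0.75:  # 'strong but not overwhelming' -- an assumed cutoff
        return "Strong evidence for the majority view, but it can be outweighed."
    if share_agreeing > 0.5:
        return "Weak evidence; agnosticism or your own judgement may be reasonable."
    return "No majority view; form your own opinion (or stay agnostic)."


if __name__ == "__main__":
    print(deference_advice(0.97, False))  # overwhelming consensus
    print(deference_advice(0.80, False))  # strong but not overwhelming majority
    print(deference_advice(0.80, True))   # opinions predictable from irrelevant info
```

The point is not the exact numbers, but the structure: the strength of expert consensus sets how much weight to give it, and condition 4 can override the rest.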
What about selection bias?
Emrik raised a concern about deferring to experts: the people most informed about a field are also selected for believing that the field is sound and worth working in. See his post The Paradox of Expert Opinion:
https://www.lesswrong.com/posts/S6Qcf5EgX5zAozTAa/the-paradox-of-expert-opinion
This is why it's so rare for the first, strongest condition to hold in practice. It isn't always a problem, but unless selection effects are controlled for, deferring on the basis of apparent consensus will produce wrong results.
Conclusion

So what's next? Hopefully this is a useful resource that lets you defer quite a bit better, and with better reasons, than before.