Doubt Certainty
A feeling of certainty should usually be regarded as evidence of cognitive bias.
The LessWrong page on absolute certainty (https://www.lesswrong.com/tag/absolute-certainty) offers an extreme version of this principle: absolute certainty requires a standard of evidence so high that we are more likely to have made a mistake than to be truly justified in claiming it.
Less extreme forms of certainty do not necessarily require cognitive bias. For example, I feel certain that the Sun will rise over my house in the morning. If challenged, I will admit that the odds are not 100%, and will give an estimate based mostly on the odds of my house not existing in the morning. Most people will accept that both my feeling and my estimate are rationally justifiable, though perhaps with quibbles about how the Sun does not truly rise.
But certainty can also be tied to cognitive biases. A cognitive bias makes it easy to think some things and hard to think their opposites, and that one-sidedness produces a feeling of certainty. Now suppose that we commit to the position we feel certain of. Evidence against it will cause cognitive dissonance, leading us to reject that evidence. Which means that certainty creates a cognitive bias that reinforces the certainty!
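To make that loop concrete, here is a toy simulation. It is mine, not anything from the post's sources, and every number in it (the 0.55 starting lean, the 1.1 likelihood ratio, the 200 steps) is an illustrative assumption. It compares an honest Bayesian updater against one that discards any evidence contradicting its current lean, and feeds both pure noise:

```python
import random

random.seed(0)

def bayes_update(p: float, supports: bool, lr: float = 1.1) -> float:
    """One Bayesian update of belief p, where lr is the likelihood
    ratio carried by a single piece of evidence."""
    odds = (p / (1 - p)) * (lr if supports else 1 / lr)
    return odds / (1 + odds)

def final_belief(rejects_dissonant: bool, steps: int = 200) -> float:
    """Run one reasoner over a stream of pure-noise evidence."""
    p = 0.55  # a slight initial lean toward the favored position
    for _ in range(steps):
        supports = random.random() < 0.5  # fair coin: the evidence means nothing
        if rejects_dissonant and (p > 0.5) != supports:
            continue  # dissonance: evidence against the current lean is discarded
        p = bayes_update(p, supports)
    return p

def average(rejects_dissonant: bool, trials: int = 200) -> float:
    return sum(final_belief(rejects_dissonant) for _ in range(trials)) / trials

print("honest updater, average final belief:", round(average(False), 2))
print("dissonance-driven updater, average final belief:", round(average(True), 2))
```

Fed evidence that means nothing, the honest updater's belief stays near where it started, while the dissonance-driven updater ratchets up to near-certainty.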
So when we encounter a feeling of certainty in ourselves or others, we should ask whether that certainty comes from overwhelming evidence or from a cognitive bias. Bayesian reasoning tells us that the stronger the sense of certainty, and the more complex the chain of evidence leading to the conclusion, the more likely it is that we have encountered cognitive bias. Therefore, a feeling of certainty is evidence of cognitive bias.
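Here is a rough sketch of that Bayesian claim. The model is mine and deliberately crude: I assume each step in a chain of reasoning is independently right 90% of the time (so justified confidence can be at most 0.9 to the power of the chain length), that a biased reasoner feels certain no matter what, and a 50/50 prior on bias. All of those numbers are illustrative assumptions:

```python
STEP_RELIABILITY = 0.9  # hypothetical: each reasoning step is 90% reliable

def rational_cap(n_steps: int) -> float:
    """Upper bound on justified confidence when a conclusion rests on
    n_steps independent reasoning steps."""
    return STEP_RELIABILITY ** n_steps

def p_bias_given_certainty(felt: float, n_steps: int,
                           prior_bias: float = 0.5) -> float:
    """Bayes' rule: how likely is bias, given a felt level of certainty?

    Toy likelihood model (illustrative, not from the post):
    - A biased reasoner feels certain regardless: P(felt | bias) = 1.
    - An unbiased reasoner's certainty tracks the evidence, so the odds
      of honestly feeling more certain than the rational cap shrink as
      the gap grows: P(felt | no bias) = min(1, cap / felt).
    """
    cap = rational_cap(n_steps)
    like_no_bias = min(1.0, cap / felt)
    return prior_bias / (prior_bias + like_no_bias * (1 - prior_bias))

for n in (1, 3, 10):
    print(f"chain of {n:2d} steps: rational cap {rational_cap(n):.2f}, "
          f"P(bias | 99% certain) = {p_bias_given_certainty(0.99, n):.2f}")
```

Under these toy numbers, being 99% certain at the end of a ten-step chain puts the posterior probability of bias around 74%, versus about 52% for a single step: stronger certainty plus longer chains shift the verdict toward bias.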
To be clear, it always feels as though your certainty is based on evidence. Which is exactly why it is useful to realize that you should be extra suspicious of cognitive bias.
That’s a nice-sounding theory, but can I provide data suggesting that it really happens that way?
First, let’s turn to Philip Tetlock’s book Expert Political Judgment. Tetlock started with a group of bona fide experts, asked them questions about their credentials and beliefs, and then asked them to make predictions about the future. Following Isaiah Berlin, he divided them into foxes and hedgehogs: the hedgehogs had one big idea they were certain of, while the foxes balanced many approaches.
On average, the experts’ predictions were basically random. But the mistakes were far from evenly distributed. The hedgehogs felt certain, so I would predict that they were thinking poorly. And indeed, their predictions were significantly worse than chance. By contrast, the foxes predicted future events significantly better than chance would suggest.
But the story doesn’t end there! We would hope that the better thinkers, the foxes, would be rewarded. The opposite happened. The certain hedgehogs got the bulk of the lucrative pundit positions, such as hosting talk shows.
Why? My best guess is that we like outsourcing our thinking. When we encounter someone who is smart, knowledgeable, certain, and says things that we like, it is comfortable to outsource our thinking to them. Surely, if we were that smart and had learned that much, we would agree; we would decide the same. So why go through all that work? We can just repeat what they say and sound smart ourselves!
And so we are biased to accept the thinking of people who are probably not thinking well themselves!
I wish that the story ended there. But in https://rationaldino.substack.com/p/too-important-to-be-able-to-think I give a number of examples of yet another dynamic that leads to the same problem. If I really want to be good at X, it is easy for me to convince myself that I am good at X. Once I’m convinced, I will reject all evidence that I might not be good at X. Because I reject that evidence, I stop improving. Because I stop improving, there’s a pretty good chance that I’m not nearly as good at X as I think I am.
Sadly, this does not simply happen for random values of X. This dynamic is most likely for the things that matter most to us. Which means that my certainty of doing well was not just evidence of cognitive bias. Had I taken the warning seriously, it would have been evidence of where my cognitive biases were making it likely that I’d make my most painful life mistakes...
But that’s getting far afield from the main point. My main point is to be on the lookout for certainty in yourself and others. When you encounter it, actively look for reasons why that person might have a cognitive bias. Because there is a decent chance that cognitive biases are causing the feeling of certainty.