I thought rational ignorance was a part of bounded rationality—people do not investigate every contingency because they do not have the computational power to do so, and thus their decision-making is bounded by their computational power.
You have distinguished this from motivated cognition, in which people succumb to confirmation bias, seeing only what they want to see. But isn’t a bias just a heuristic, misapplied? And isn’t a heuristic a device for coping with limited computational capacity? It seems that a bias is just a manifestation of bounded rationality, and that this includes confirmation bias and thus motivated cognition.
Yes, bounded rationality and rational ignorance are consequences of the limits of human computational power. But humans have more than enough computational power to do better than in-group bias, anchoring effects, deferring to authority simply because it is authority, or believing something because we want it to be true.
We’ve had that capacity since recorded history began, yet ordinary people tend not to notice that they are failing to consider all the possibilities. By contrast, it’s not uncommon for people to realize that they lack some relevant knowledge. That realization isn’t common, and people aren’t always willing to admit it, but it does seem possible to change, which is much less clear for cognitive bias.