I basically agree. A heuristic lets System 1 function without invoking (the much slower) System 2. We need heuristics to get through the day; we couldn't function if we had to reason out every single behavior we implement. A heuristic becomes a bias when it's dysfunctional, resulting in a poorly chosen System 1 behavior where System 2 could give a significantly better outcome.
One barrier to rationality is that updating one's heuristics is effortful and often kind of annoying, so we always carry some outdated heuristics. The faster things change, the worse this problem gets. Too much trust in one's heuristics risks biased behavior; too little yields indecisiveness.