Therefore, learning about cognitive biases teaches you to notice when you are in a dangerous situation where you should “halt” your System 1 and offload your decision-making to System 2.
There’s no good reason to believe that things work like that. Conscious attempts to block a bias like the Hindsight bias don’t work.
Is it possible to train your “general intuition” (if such a thing exists), or your capability to develop domain-specific intuitions more quickly and more reliably?
“Noticing” seems to be a skill that’s useful whenever you are interacting with intuition. Mindfulness meditation also provides broad benefits for System 1 processes.
Conscious attempts to block a bias like the Hindsight bias don’t work.
You don’t try to correct for a single bias. What I am saying is that if you find yourself in a situation where making a correct decision is important, and you know that this is the type of situation where your intuitive thinking is particularly likely to be affected by cognitive biases, you should try to make the decision using your System 2. Instead of trying to block a single bias that affects System 1, we try to switch to using System 2 (of course, we have to pay a price). For example, suppose you have to figure out whether a person X was negligent in failing to prepare for a disaster even though they did receive some information that something bad was coming. You notice that this is the kind of case where your ability to make a correct decision is likely to suffer from the hindsight bias. Hence you try to eschew your intuitive reasoning, opting to use deliberate reasoning instead. For similar reasons, while they do not avoid biases altogether, trials by court are strongly preferable to trials by mob or by media: the latter usually do not even attempt to curb their biases, whereas the structure of the former encourages the kind of thinking that engages System 2.
Of course, switching to using System 2 doesn’t guarantee that one’s decision will be correct. And in some cases, switching from System 1 to System 2 might not be possible.
You notice that this is the kind of case where your ability to make a correct decision is likely to suffer from the hindsight bias. Hence you try to eschew your intuitive reasoning, opting to use deliberate reasoning instead.
And that doesn’t work for most people. From Eliezer’s article on hindsight bias:
Kamin and Rachlinski (1995) asked two groups to estimate the probability of flood damage caused by blockage of a city-owned drawbridge. The control group was told only the background information known to the city when it decided not to hire a bridge watcher. The experimental group was given this information, plus the fact that a flood had actually occurred. Instructions stated the city was negligent if the foreseeable probability of flooding was greater than 10%. 76% of the control group concluded the flood was so unlikely that no precautions were necessary; 57% of the experimental group concluded the flood was so likely that failure to take precautions was legally negligent. A third experimental group was told the outcome and also explicitly instructed to avoid hindsight bias, which made no difference: 56% concluded the city was legally negligent.
You can’t escape the bias by simply making a decision to go to System 2.
For similar reasons, while they do not avoid biases altogether, trials by court are strongly preferable to trials by mob or by media: the latter usually do not even attempt to curb their biases, whereas the structure of the former encourages the kind of thinking that engages System 2.
I think you confuse cognitive bias with biased interests. A journalist who writes an article does think about the issue with System 2. It’s just that a fair trial isn’t his goal.
You can’t escape the bias by simply making a decision to go to System 2.
System 2 is not enough. But if there is any hope of mitigating the bias in a particular situation where finding the correct answer is important, the only way to find it is to search for it using your System 2. Suppose you have to estimate the foreseeable probability of flooding, you know about hindsight bias, and you actually care about finding the correct estimate. How should you proceed? Perhaps you should devise the method you will use to make the estimate before looking at the data. Or perhaps you will decide to obtain data from various cities in the same region and use logistic regression to make an estimate. Or perhaps you will use some other estimator. What I am trying to say is that all these methods of deliberate reasoning (decision-making algorithms, statistical analysis, etc.) are executed by System 2 and not System 1. I am not trying to say that System 2 guarantees that we will avoid bias. Firstly, in my understanding, System 2 and System 1 aren’t separate systems; they are merely two ends of a continuum. Secondly, just because an algorithm is executed by System 2 doesn’t mean that the algorithm is good.
As I have said, System 2 seems to be close to necessary, but it is obviously not sufficient (for example, rolling a die to “determine” the probability of flooding doesn’t rely on intuition, yet it is clearly a poor estimator). Algorithms that are executed by System 2 are usually somewhat more transparent, so it is easier to detect their mistakes and biases, which in turn makes them easier to fix. Thus there is a chance that at least some of those algorithms are good enough to be good estimators and avoid biases.
Transparency is what makes System 2 preferable to System 1 in this particular situation. In other types of situations or other types of questions, as dthunt noted, feedback loops can be useful to train your intuition to achieve greater accuracy even though intuitive reasoning itself is not necessarily transparent.
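To make the logistic-regression suggestion above concrete, here is a minimal sketch of what such an estimator might look like. The cities, features, and numbers are invented purely for illustration; the 10% threshold comes from the Kamin and Rachlinski scenario quoted earlier. The important point is that the estimation rule is fixed before the outcome is consulted.

```python
# Minimal sketch of the "pool data from comparable cities, then fit a
# logistic regression" idea from the comment above. All data are invented.
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Hypothetical records: one row per city-year.
df = pd.DataFrame({
    "peak_river_level_m": [3.1, 4.5, 2.8, 5.0, 3.9, 4.8, 2.5, 4.2],
    "drawbridge_blocked": [0,   1,   0,   1,   0,   1,   0,   1],
    "flood_occurred":     [0,   1,   0,   1,   0,   1,   0,   0],
})

features = ["peak_river_level_m", "drawbridge_blocked"]
model = LogisticRegression().fit(df[features], df["flood_occurred"])

# Estimate the foreseeable probability for the city in question, using
# only the information that was available before the outcome was known.
city = pd.DataFrame([{"peak_river_level_m": 4.0, "drawbridge_blocked": 1}])
p_flood = model.predict_proba(city)[0, 1]
print(f"Estimated foreseeable flood probability: {p_flood:.2f}")
print("Above the 10% threshold" if p_flood > 0.10 else "Below the 10% threshold")
```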
System 2 is not enough. But if there is any hope of mitigating the bias in a particular situation where finding the correct answer is important, the only way to find it is to search for it using your System 2.
No, you can also get a good night’s sleep to give your System 1 more time to analyse the situation and improve its output.
You can also do exercises to calibrate your credence. Calibration training for probability estimates is probably one of the best ways to get them right.
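As a rough illustration of what calibration training involves (the logged predictions below are made up): you record probability estimates together with outcomes, then check whether the things you called “70% likely” actually happened about 70% of the time.

```python
# Toy calibration check: bucket logged probability estimates and compare
# each bucket's stated confidence to the observed frequency of "true"
# outcomes. The example predictions are invented.
from collections import defaultdict

# (stated probability, did it actually happen?)
log = [(0.9, True), (0.8, True), (0.7, False), (0.7, True),
       (0.6, False), (0.9, True), (0.5, False), (0.8, False)]

buckets = defaultdict(list)
for p, outcome in log:
    buckets[round(p, 1)].append(outcome)

for p in sorted(buckets):
    outcomes = buckets[p]
    observed = sum(outcomes) / len(outcomes)
    print(f"stated {p:.0%}: observed {observed:.0%} over {len(outcomes)} predictions")
```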
That’s not hindsight bias. Having a flood is Bayesian evidence in favor of the correct estimate of flooding having been high (and therefore in favor of negligence).
Yes, but it is hindsight bias to give that a lot of weight. (This is a common pattern for many biases: they involve things which are, strictly speaking, genuine Bayesian evidence, but on which humans frequently over-update massively.)
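A toy Bayes update with invented numbers shows the distinction: observing the flood is genuine evidence that the risk was high, but it should move the estimate only part of the way, whereas hindsight-biased judgment treats the outcome as if it settled the question.

```python
# Toy Bayesian update with invented numbers. Two hypotheses about the
# city's annual flood risk; observing one flood is genuine evidence for
# the "high risk" hypothesis, but far from conclusive.
p_high_prior = 0.5          # prior: the flood risk was high (20% per year)
p_low_prior = 0.5           # prior: the flood risk was low (5% per year)
p_flood_given_high = 0.20
p_flood_given_low = 0.05

# P(high risk | a flood occurred), by Bayes' rule
numerator = p_flood_given_high * p_high_prior
posterior_high = numerator / (numerator + p_flood_given_low * p_low_prior)
print(f"Posterior that the risk was high: {posterior_high:.2f}")  # 0.80

# Hindsight-biased reasoning effectively replaces this 0.80 with ~1.0,
# treating the outcome as if it proved the risk was obviously high.
```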