Cognitive bias research gives us a long list of situations where System 1 may fail to give you the best answer, producing a biased answer instead. Learning about cognitive biases therefore teaches you to notice when you are in a dangerous situation where you should “halt” your System 1 and offload your decision making to System 2. Naturally, the next question is: how do you train System 1 and System 2 themselves?
How does one train one’s System 1?
If you spend a lot of time analyzing data in your field, you can develop a domain-specific intuition. Is it possible to train your “general intuition” (if such a thing exists), or your capability to develop domain-specific intuitions more quickly and more reliably? Mathematicians often talk about beauty and elegance as good guiding principles, and about the importance of having good mathematical taste (e.g. Terence Tao briefly mentions it here). But why do some people have better taste than others? How do you train your taste? Is good taste (or good taste in ideas) an example of intuition that is somewhat less domain-specific? Or is it still too domain-specific? By the way, is developing good taste in art or music helpful for strengthening your “general intuition” (if such a thing exists)? If so, taste in what kind of art is most helpful for aiding the development of the aforementioned “general intuition”?

Or is System 1 simply shorthand for “everything that isn’t System 2”, and intuition shorthand for “the ability to acquire knowledge without inference or the use of reason”, so that you cannot train your System 1 in general, only specific parts of it?
How to train your System 2?
There is a famous quote by Alfred North Whitehead:
It is a profoundly erroneous truism, repeated by all copy-books and by eminent people when they are making speeches, that we should cultivate the habit of thinking of what we are doing. The precise opposite is the case. Civilization advances by extending the number of important operations which we can perform without thinking about them. Operations of thought are like cavalry charges in a battle — they are strictly limited in number, they require fresh horses, and must only be made at decisive moments.
It seems that “finding better ways to organize your knowledge” is one way to train your System 2. Incidentally, the quote also suggests that we improve our thinking by developing reliable ways to offload mental burden onto System 1, so it is not, strictly speaking, only about System 2 (of course, the concepts of “System 1” and “System 2” belong to the map, not the territory).
What are other ways to train your System 2, besides the aforementioned “finding better ways to organize your knowledge” and finding ways to reliably offload some of the work to System 1? Developing axiomatic systems? Learning to use logic and Bayesian inference? Are there any others?
Learning about cognitive biases therefore teaches you to notice when you are in a dangerous situation where you should “halt” your System 1 and offload your decision making to System 2.
There’s no good reason to believe that things work like that. Conscious attempts to block a bias like hindsight bias don’t work.
Is it possible to train your “general intuition” (if such a thing exists), or your capability to develop domain-specific intuitions more quickly and more reliably?
“Noticing” seems to be a skill that’s useful whenever you are interacting with intuition. Mindfulness meditation also provides broad benefits for System 1 processes.
Conscious attempts to block a bias like hindsight bias don’t work.
You don’t try to correct for a single bias. What I am saying is that if you find yourself in a situation where making a correct decision is important, and you know that this is the type of situation where your intuitive thinking is particularly likely to be affected by cognitive biases, you should try to make the decision using your System 2. Instead of trying to block a single bias that affects System 1, we try to switch to using System 2 (of course, we have to pay a price).

For example, suppose you have to figure out whether a person X was negligent in failing to prepare for a disaster even though they did receive some information that something bad was coming. You notice that this is the kind of case where your ability to make a correct decision is likely to suffer from hindsight bias. Hence you try to eschew your intuitive reasoning, opting to use deliberate reasoning instead. For similar reasons, while they do not avoid biases altogether, trials by court are strongly preferable to trials by mob or trials by media: the latter usually do not even attempt to curb their biases, whereas the structure of the former encourages the kind of thinking that employs System 2.
Of course, switching to using System 2 doesn’t guarantee that one’s decision will be correct. And in some cases, switching from System 1 to System 2 might not be possible.
You notice that this is the kind of case where your ability to make a correct decision is likely to suffer from hindsight bias. Hence you try to eschew your intuitive reasoning, opting to use deliberate reasoning instead.
And that doesn’t work for most people. From Eliezer’s article on hindsight bias:
Kamin and Rachlinski (1995) asked two groups to estimate the probability of flood damage caused by blockage of a city-owned drawbridge. The control group was told only the background information known to the city when it decided not to hire a bridge watcher. The experimental group was given this information, plus the fact that a flood had actually occurred. Instructions stated the city was negligent if the foreseeable probability of flooding was greater than 10%. 76% of the control group concluded the flood was so unlikely that no precautions were necessary; 57% of the experimental group concluded the flood was so likely that failure to take precautions was legally negligent. A third experimental group was told the outcome and also explicitly instructed to avoid hindsight bias, which made no difference: 56% concluded the city was legally negligent.
You can’t escape the bias by simply making a decision to go to System 2.
For similar reasons, while they do not avoid biases altogether, trials by court are strongly preferable to trials by mob or trials by media: the latter usually do not even attempt to curb their biases, whereas the structure of the former encourages the kind of thinking that employs System 2.
I think you confuse cognitive bias with biased interests. A journalist who writes an article does think about the issue with System 2. It’s just that a fair trial isn’t his goal.
You can’t escape the bias by simply making a decision to go to System 2.
System 2 is not enough. But if there is a shred of hope of mitigating the bias in a particular situation where finding a correct solution is important, the only chance to find it is to search for it using your System 2. Suppose you have to estimate the foreseeable probability of flooding, you know about hindsight bias, and you actually care about finding the correct estimate. How should you proceed? Perhaps you have to devise, in advance of looking at the data, the method you will use to make the estimate. Or perhaps you will decide to obtain data from various cities in the same region and use logistic regression to make an estimate (see the sketch below). Or perhaps you will use some other estimator. What I am trying to say is that all these methods of deliberate reasoning (decision-making algorithms, statistical analysis, etc.) are executed by System 2 and not System 1. I am not trying to say that System 2 guarantees that we will avoid bias. Firstly, in my understanding, System 2 and System 1 aren’t separate systems; they are merely two ends of a continuum. Secondly, just because an algorithm is executed by System 2 doesn’t mean that that algorithm is good.
As I have said, System 2 seems to be close to necessary, but it is obviously not sufficient (for example, rolling a die to “determine” the probability of flooding doesn’t rely on intuition, yet it is plainly a bad estimator). Algorithms that are executed by System 2 are usually somewhat more transparent, so it is easier to detect their mistakes and biases, which means it is easier to fix them. Thus there is a chance that at least some of those algorithms are good enough to serve as good estimators and avoid biases.
Transparency is what makes System 2 preferable to System 1 in this particular situation. In other situations, or for other types of questions, as dthunt noted, feedback loops can be useful for training your intuition to achieve greater accuracy, even though intuitive reasoning itself is not necessarily transparent.
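To make the logistic-regression suggestion above concrete, here is a minimal sketch in Python. Everything in it is hypothetical: the choice of features, the city records, and all the numbers are invented for illustration, not taken from the thread.

```python
# A rough sketch of the pre-registered estimator idea. All data below is
# invented: hypothetical flood records from comparable cities.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Columns: annual peak rainfall (mm), peak river level (m)
X = np.array([[310, 4.1], [275, 3.2], [420, 5.0], [390, 4.6],
              [260, 2.9], [480, 5.6], [300, 3.8], [450, 5.2]])
y = np.array([0, 0, 1, 1, 0, 1, 0, 1])  # 1 = a flood occurred that year

model = LogisticRegression(max_iter=1000).fit(X, y)

# Commit to the estimator *before* learning whether a flood actually
# happened, then read off the foreseeable probability for this city.
p_flood = model.predict_proba([[350, 4.3]])[0, 1]
print(f"Estimated flood probability: {p_flood:.2f}")
```

The point is not this particular model; it is that the procedure is fixed in advance and its assumptions are inspectable, which is exactly the transparency advantage discussed above.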
System 2 is not enough. But if there is a straw of hope to mitigate the bias in a particular situation where finding a correct solution is important, the only chance to find it is to search for it using your System 2.
No, you can also get a good night’s sleep to give your System 1 more time to analyse the situation and improve its output.
You can also do exercises to calibrate your credences. Calibration training for probability estimates is probably one of the best ways to get them right.
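As a concrete illustration of what calibration training measures, here is a minimal sketch in Python; the logged estimates are made up for the example.

```python
# Group logged probability estimates by stated confidence and compare each
# group's hit rate to the confidence level. A calibrated forecaster's "70%"
# claims come true about 70% of the time. (Data invented.)
from collections import defaultdict

# (stated confidence, did it happen?)
log = [(0.9, True), (0.9, True), (0.9, False), (0.7, True),
       (0.7, False), (0.7, True), (0.5, False), (0.5, True)]

buckets = defaultdict(list)
for confidence, outcome in log:
    buckets[confidence].append(outcome)

for confidence in sorted(buckets):
    outcomes = buckets[confidence]
    hit_rate = sum(outcomes) / len(outcomes)
    print(f"said {confidence:.0%}: right {hit_rate:.0%} ({len(outcomes)} claims)")
```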
That’s not hindsight bias. Having a flood is Bayesian evidence in favor of the correct estimate of flooding having been high (and therefore in favor of negligence).
Yes, but it is hindsight bias to give that a lot of weight. (This is a common thing for lots of biases: they involve things which are, strictly speaking, genuine Bayesian evidence, but on which humans frequently massively over-update.)
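To see the size of the update actually licensed, here is a toy Bayes calculation in Python. The two point hypotheses and the prior are made-up numbers, chosen only to illustrate the shape of the argument:

```python
# How much should a single observed flood shift your belief that the
# foreseeable flood probability was high? (All numbers hypothetical.)
p_high = 0.15             # hypothesis A: true annual flood probability was 15%
p_low = 0.05              # hypothesis B: it was 5%
prior_odds = 0.25 / 0.75  # prior: 25% credence in hypothesis A

likelihood_ratio = p_high / p_low  # 3.0: the flood is genuine evidence
posterior_odds = prior_odds * likelihood_ratio
posterior = posterior_odds / (1 + posterior_odds)

print(f"likelihood ratio: {likelihood_ratio:.1f}")
print(f"P(probability was high | one flood) = {posterior:.2f}")  # 0.50
```

Under these assumed numbers, a 3:1 likelihood ratio moves 25% credence to 50%; it does not move it anywhere near certainty, which is roughly what treating the flood as proof of negligence would require.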
If you have some decision-making process you use a lot and expect to build intuition around later, make sure you have the right feedback loops in place, so that you have something to help keep that intuition calibrated. (This also applies to processes you engineer for others.)
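One cheap way to put such a feedback loop in place is a decision journal that records each estimate when it is made and scores it once the outcome is known. A minimal sketch in Python; the file format and field names are just one possible design:

```python
# Append predictions to a journal at decision time; score resolved ones
# later with the Brier score (mean squared error: 0 is perfect, and always
# guessing 50% earns 0.25).
import json
import time

JOURNAL = "decisions.jsonl"

def record(question, probability):
    """Log a prediction the moment it is made."""
    entry = {"time": time.time(), "question": question,
             "probability": probability, "outcome": None}
    with open(JOURNAL, "a") as f:
        f.write(json.dumps(entry) + "\n")

def brier(entries):
    """Score entries whose outcome (0 or 1) has been filled in."""
    resolved = [e for e in entries if e["outcome"] is not None]
    return sum((e["probability"] - e["outcome"]) ** 2
               for e in resolved) / len(resolved)
```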