A simpler maxim is to pay attention to (fixing) cognitive errors at all times, without excuses. Correctness of a prediction is potentially useful data, but it’s also an excuse for overlooking flaws in the prediction procedure.
Fixing a flaw in a procedure (behavior, skill, intuition) is a task that’s separate from updating the prediction. Predictions need updating simply as a matter of recycling cached thoughts, which might be useful even if you are not fixing any particular reasoning error.
There is also a more subtle kind of cognitive error, where you can correctly solve a problem using the right systematic/conscious/formal method, but your intuition (maybe just one of the relevant intuitive mental models) is out of tune with this process. It’s useful to have at your disposal somewhat reliable intuitive guidance in thinking about a problem (it can be crucial in forming a plan for solving it), so when a discrepancy appears, it means there is a bug either in the intuition or in the more formal procedure, which motivates looking into the discrepancy in more detail and fixing the bug.
But what are the (potential) flaws in the prediction procedure? The only way to figure that out is to see which cognitive behaviors lead to accuracy and which lead to error. It is all very well to say that we should not commit cognitive errors, but that does not help us with the problem, because what counts as a cognitive error is defined by whether it leads us away from the truth.