One view is that people are able to vary their biases, so that they're more biased when the bias is helpful and less biased when it gets in the way. For instance, Armor & Sackett (2006) found that people were overly optimistic in predicting their performance on hypothetical tasks, but relatively accurate in predicting their performance on real tasks they were actually about to do (when they would soon have to face the reality of their performance). That's consistent with near/far theory: in far mode, people can afford to hold more biased, flattering images of themselves.
With sunk costs, one of the most relevant theories is Gollwitzer's theory of deliberative and implemental mindsets. He argues that when people face a decision, they tend to enter a deliberative mindset, which lets them view the options relatively accurately so that they can make a good decision. But once they've made a decision, their focus shifts to implementing the chosen course of action: planning how to get it done and figuring out how to overcome any obstacles. Thoughts about whether the course of action really is desirable (or feasible) are distractions and potential demotivators, so people in an implemental mindset mostly don't entertain them, and are overly optimistic when they do.
If rationality training de-compartmentalizes people and makes it harder to switch between different mindsets, then that could be a disadvantage. It could be harder to carry out a project if you keep putting yourself back in the mindset of deciding whether or not it's worth doing.
But it doesn't have to happen that way: rationality training could teach you to respond appropriately to these changes in circumstances. For example, one alternative way of thinking in situations where people are prone to the sunk cost fallacy is to ask yourself whether now is a good time to re-evaluate your course of action. Maybe the answer is no, it's not a good time: you're in the middle of doing things and re-evaluation would be a distraction, or you put a lot of thought into your original decision and, immersed in the project, you wouldn't have the perspective to re-evaluate it now. In that case you can keep going with the project and trust your original decision; there's no need to justify it. But sometimes the answer is yes, it is a good time to re-evaluate: circumstances have changed, or you have new information that might have changed your decision had you known it from the start, or you're considering some new commitment you hadn't anticipated. If it is a good time to re-evaluate, then step back and take a fresh look at the decision in light of everything you've learned; there's no need to be beholden to your old decision.
Armor, D. A., & Sackett, A. M. (2006). Accuracy, error, and bias in predictions for real versus hypothetical events. Journal of Personality and Social Psychology, 91, 583-600.