How do we know it’s a problem? In a world where we can’t make perfect (or even very good) point predictions, and on most topics lack the ability to formalize a model of probability space, what is the proper level of optimism (that is, how pleasant should the examples be that we feed the availability heuristic we will inevitably use)? Your important question at the end seems like the right one to be asking: how can a realist improve the situation?
And it starts with defining “improve”. In many cases, optimism is the only way to actually start an ambitious project; the realist option is to maintain the status quo, which is not clearly better than taking a risk of failure.
I often wonder if optimism bias is a cultural reaction to other biases, like loss aversion and the drive to conformity. If so, we’ll need to address those at the same time, or we’re moving AWAY from truth by removing only one side of the bias equilibrium.