The goal of this type of problem would be to train bias recognition until it becomes second nature, in the hope that the skill would then also trigger in your own thought processes.
Part of what rationality is about is that you don't just hope for beneficial outcomes.
Cognitive bias is a term that comes out of the psychology literature, and there are plenty of studies in that domain. My understanding is that academic research has not found that teaching people to recognize biases gets you very far.
Outside of academia, CFAR did think about whether you can make people more rational by giving them exercises, and came to the conclusion that those exercises should look quite different.
In a case like this, asking yourself "What evidence do I have that what I hope for will actually happen?" and "What sources, be they academics or other experts I might interview, could give me more evidence?" would be much more productive than asking "What things in my thought process might be labeled as biases?"