This is VERY interesting. I’m as baffled as you are, sorry to say.
It seems you’ve described rationalizations that prevent true (or ‘maximally accurate’) beliefs. Have you tried asking these case studies about their rationales for decision-making? One theme of my own rationalization factory is spitting out true-but-misleading reasons for doing things, rarely allowing me to reason my way into doing what I know—somehow—that I should. Said factory operates by preventing me from thinking certain thoughts. Perhaps something similar goes on in these people?
I’ve performed one heck of an update thanks to your comment and realizing that I was generalizing from only a few examples.