What of those who act hypocritically in that something they profess doesn’t match certain actions—but simply haven’t yet become aware of the hypocrisy? Theirs isn’t a conscious decision to choose illusion over reality, and neither is their self-deception born of akrasia, this weakness of will you speak of. This belief and this action remain non-overlapping magisteria in their minds, to steal from Gould.
Take the example of a person who professes to seek and embrace truth wherever they find it. Jolly good for them! But they don’t take the next step of acting on this belief to search out and test other ‘truths’ they espouse, and so they remain an active participant in a cult, even going so far as to say they believe in the cult because it is true. Clearly there is some kind of self-deception going on here, but how could one recognize it, and where is the bias or heuristic at work? It’s not deliberate hypocrisy, nor akrasia. Is it simply a lack of follow-through, or is there more to it? Someone could go their whole life without realizing they hold two contradictory ideas, or that their actions contradict something they profess.
At some future point this individual could have a crisis. They profess to believe in the cult because it is true. They learn that some claim made by the cult is untrue. Now they must choose between what were once two non-overlapping magisteria: stick with the cult by inventing some other justification and continue professing it despite contrary evidence, or honor their other professed commitment to seeking out truth.
I don’t know where I’m going with this. I’d be interested to see your take on this though. How much can a lack of putting two and two together be a form of self-deception? I see how some organizations or ideas could stand to profit by isolating themselves from investigative thought. Say you’re taught all your life that communism, or capitalism, or religion, or atheism, or pork is bad. This claim could be true, could be false, or somewhere in between. Plenty of people go their whole life never thinking of challenging the claim, of looking to verify it. The thought never crosses their mind. How can we combat this tendency to just accept something and never give it a second thought, to never realize how one idea or claim could be contradictory to a different one we hold? Great articles! I’m highly enjoying them.
Plenty of people go their whole life never thinking of challenging the claim, of looking to verify it. The thought never crosses their mind.
Do you have any evidence that this is the case? In my experience, it’s not. Most people tend to feel that coming up with an argument for why their particular beliefs are true is extremely important, it’s just that they then fall into patterns of confirmation/disconfirmation bias, affect bias, and mind-projection fallacy. A person raised in a fundamentalist Christian home is taught that there are really true and verifiable explanations for the specifics of why evolution is wrong. There’s a huge propensity to find ambiguity intolerable. The more dogmatically an assertion is held, the more intolerable ambiguity becomes.
The cases where error exists due to sheer lack of curiosity don’t appear to be common. But in those cases, maybe a more fundamental question to ask is why there is a lack of curiosity. Perhaps it is social convention to accept an idea, and it would be uncomfortable to challenge it. Perhaps adherence to a particular idea has provided a reasonable amount of comfort and consistency in a tribe’s history, and if they cannot imagine increases in their success and comfort, there is little motivation to stray from the accepted dogma. These kinds of questions cast light on what this sort of mistake is, which is different from hypocrisy/akrasia. I’m just not sure whether this (a) marks an important distinction or (b) is worth adding to this particular thread. It might be worth creating a new discussion thread for it, though, to get more thoughts than just my own.