We might also need an exercise just for getting people to understand the concept of motivated cognition at all.
“Motivated cognition” seems like a poor label in the first place, because most thinking is motivated. The term is redundant, and at first glance arguing against “motivated cognition” sounds like arguing against any kind of motivated thinking. That’s problematic because good thinking is also motivated, e.g. “I’ll invent FAI because it would help the world.”
One interesting thing I’ve heard repeated, and found to be true, about rationalizations is that you can usually get the truth out of someone by asking a canned line: “Is there any other reason?” I’m sure this is from some Dale Carnegie book or something, but my guess is that most people don’t feel the need to come up with multiple rationalizations. They usually come up with one “good” one and try to stick with it. “Is there any other reason why you want the cake?” “Well, I also really love chocolate cake.” “Is there any other reason you don’t want to go to [Event] besides being busy?” “Well, my ex will be there, too.” It’s by no means an airtight method, but it’s still useful for getting other people to tell you what they’re really thinking.
“If the sky is blue, I want to believe the sky is blue[.]” The question is why people don’t want to believe what is true, and the areas of focus here should be the same as with self-deception and procrastination. Irrational people believe that if they can’t see/hear/feel/recognize/know something, it can’t hurt them—self-deception—and even in those moments when some thought tells them this is irrational, or that they must eventually face the truth, they put it off—procrastination, preferring the short term over the long.
As kids, I think, the experience of “unseeing” or “trying not to see” is common, as when we thought of the monster in the dark shadows. On nights when we just shut our eyes and go to sleep, there is no monster. On nights when we keep our eyes open and look around, there suddenly is one. So the most important factor in this pre-rational understanding of the world is association: “I looked, and there seemed to be a monster; I didn’t look, and there didn’t seem to be a monster. Therefore looking determines whether there is a monster.” At some level we recognize that “the monster” is not a monster but a trick of shadows, a trick of perception. If we don’t look, we also undo the monster itself. At this level of understanding, this is a detrimental lesson. But basically everyone, even kids, recognizes this as self-deception. If there is a monster in your closet, crawling under the covers just makes you a tasty kid-burrito. If you had actually seen, rather than suspected, a monster, you would run/fight/scream.
Even though someone might logically reason that there isn’t a monster, or that there isn’t a god, or that their moral views are contradictory, there will still be a compulsion to lie to themselves, because it’s easier in the short term. People put off facing the truth the same way they put off writing essays. The kid doesn’t turn on the light, because what if the monster is actually real, even though he knows it isn’t? Checking would make him more scared than he is inside the shelter of the covers, even if the monster turns out to be fake, so he prefers to sit right where he is.
The way to deal with this situation is to apply the methods against self-deception, and then the methods against procrastination. The unfortunate part is that some people never seem to get over the hurdle of procrastination.
I would like to suggest “motivational bias” as an alternative name; it is a more accurate description than “motivated cognition”, which is much too general.
I don’t see spending too much time investigating things you don’t want to be true as too bad a problem: it’s a bit wasteful, but that’s it. “Motivated stopping” is more likely to lead you astray; you need to remember to keep questioning things you agree with. In fact, this is a common bias that con artists exploit, and a good rule is a generalization of the common self-defense against con artists contained in the phrase “If it looks too good to be true, it probably is.” If you are immediately tempted to accept or agree with an argument, take a second look.
Interesting, and useful if true.