I don’t understand why you say their advice is pretty awful. The link you give argues that being aware of cognitive biases may lead irrational people to merely ascribe them to others as a rhetorical ploy, but the Cracked.com article consistently recommends recognizing them in your own thinking.
I like their advice; it looks quite solid to me. Perhaps you could elaborate?
After re-reading the article, I agree that “awful” was too strong a word. But I still think their advice is bad.
On rationalization (#5):
You do this, too. If you’re a human being, you’re from a long line of people who got to the winner’s circle again and again by ignoring facts in favor of advancing your side. So, the next time you find yourself desperately Googling for some factual example that proves your argument is right, and failing to find even one, stop. See if you can put the brakes on and actually say, out loud, “Wait a second. If the things I’m saying in order to bolster my argument are consistently wrong, then maybe my argument is also wrong.”
This is actually pretty good; it’s definitely the best piece of advice in the article. The reason I linked to Knowing About Biases Can Hurt People: when I first learned the concept of rationalization (I was pretty young), I went around accusing all my friends of doing it during our political discussions. It wasn’t until I read Knowing About Biases Can Hurt People that I recognized the retrospectively obvious wrongness of using “rationalization!” as a counterargument. This isn’t explicitly stated in the Cracked article, but it is more strongly implied than I thought after my first read-through, so I’ll retract my criticism on this point. The rest of their advice, though, is much worse:
On neglect of probability (#4):
Again, everybody does it. The only difference is which issue is so charged for us that we’re willing to throw probability out the window. Look, we realize not everyone is going to stop shouting at each other at protests, sit down and go over the numbers. But maybe take a deep breath and think twice before the next time you tell someone: “We’ll see how you feel when it happens to you!”
Saying “don’t throw probability out the window” doesn’t really accomplish anything—you have to be aware of the cognitive landmines that stand in your way when you do use probability, otherwise you may end up worse off than before. Additionally, there are specific ways in which we abuse probability, and each of these errors has to be overcome. (Examples: base rate neglect, the gambler’s fallacy, the conjunction fallacy, scope neglect.) You also need to know how to use probabilities correctly (Bayes’ theorem, simple probability theory, etc.).
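To make the base-rate point concrete, here’s a quick Bayes’-theorem calculation (the condition, the test, and all of the numbers are made up purely for illustration): a “90% accurate” test for a rare condition still yields mostly false positives, which is exactly the kind of result base rate neglect hides from you.

```python
# Illustrative only: hypothetical numbers for a rare condition and an imperfect test.
prior = 0.01                 # P(condition): the base rate -- 1% of people have it
sensitivity = 0.90           # P(positive | condition): the test catches 90% of real cases
false_positive_rate = 0.10   # P(positive | no condition): 10% of healthy people also test positive

# Bayes' theorem: P(condition | positive) = P(positive | condition) * P(condition) / P(positive)
p_positive = sensitivity * prior + false_positive_rate * (1 - prior)
posterior = sensitivity * prior / p_positive

print(f"P(condition | positive test) = {posterior:.1%}")  # about 8.3%, not 90%
```

Neglecting the 1% base rate is what makes the intuitive answer (“the test is 90% accurate, so a positive result means I’m 90% likely to have it”) come out an order of magnitude too high.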
On paranoia (#3):
Do you support the Occupy Wall Street movement? If so, do you find it frustrating when opponents claim the protesters have a hidden agenda and are just tools of the communists? Do you support the Tea Party? Do you find it frustrating when opponents dismiss the movement as a bunch of racists? No matter what side you’re on, you’ve played that game, and all it does is give you an excuse to ignore everything the other person says. You’re dismissing their points as lies, they’re doing the same to you, so why are you even having the conversation? Because you like making everyone else at the dinner table feel tense and awkward? Either admit that maybe this person honestly thinks what they’re saying is true, or just talk about sports.
Maybe I’ve been reading too much Robin Hanson, but this isn’t always good advice. People generally don’t believe things for the reasons they claim, and though taking other people at face value is usually the most polite option, it isn’t always the most accurate one.
On correspondence bias (#2):
Forget about talking politics with your crazy shop teacher for a second. If you’re consistently doing this when arguing with your significant other, that’s a good sign that the relationship is dying. Listen for it—when you forgot to do the dishes, it was because you had a thousand other things on your mind. When she forgot, it’s because she doesn’t care. If you find yourself automatically dismissing your partner’s explanations as “excuses,” you’ve gone to a bad place from which most relationships do not return.
This may be good relationship advice, I don’t know, but it definitely won’t help you overcome the fundamental attribution error. The article never really tells you how to counteract this bias at all.
On ignoring the facts (#1):
You won’t remember this. You’re hard-wired to remain entrenched, and the Internet makes it worse because your political beliefs are pasted all over Facebook and wherever else you post your opinions. Backing down means going back on all that. It means letting down your team. Every inch of your psychology will fight it. Technology gives us the power to be wrong forever. The scary part? The same logical fallacy that prevents that crazy guy who keeps predicting the end of the world over and over from admitting maybe he was full of shit is the same fallacy that drives partisan politics and, therefore, government policy. Sleep tight, voters! Evolution is working against us.
This is mostly true—and frighteningly so—but, again, it isn’t helpful. It’s basically saying that nothing will stop you from believing what you want to believe, and no amount of studying cognitive science will change that. While it is true that completely overcoming cognitive biases is probably impossible without significant brain modification, to argue that this invalidates studying rationality is a fallacy of gray.