Interesting article on Vox (not a new one, but it’s the first time I’ve seen it and I thought I’d share; apologies if it’s been featured here before) on ‘how politics makes us stupid’: http://www.vox.com/2014/4/6/5556462/brain-dead-how-politics-makes-us-stupid
tl;dr: The smarter you are, the less likely you are to change your mind on certain issues when presented with new information, even when the new information is very clearly, simply, and unambiguously against your point of view.
The smarter you are, the less likely you are to change your mind on certain issues when presented with new information
In an adversarial setting—e.g. in the middle of culture warfare—this is an entirely valid response.
If you just blindly update on everything and I control what evidence you see, I can make you believe anything with arbitrarily high credence. Note that this does not necessarily involve any lying, just proper filtering.
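As a toy sketch of that filtering effect (my own hypothetical example, not anything from the article or the study): suppose a coin is actually fair, but I only ever report the flips that came up heads. Every single report is true, yet a naive Bayesian who treats my reports as a random sample of the flips ends up nearly certain the coin is biased.

```python
# Hypothetical toy simulation: updating on adversarially filtered (but true) evidence.
# A fair coin is flipped many times; only the heads are reported. A naive updater
# with a uniform Beta(1, 1) prior on the heads-probability p treats each report as
# an independent random draw, so the posterior mean for p is pushed toward 1.
import random

random.seed(0)
flips = [random.random() < 0.5 for _ in range(10_000)]   # the true, unfiltered data
reported = [f for f in flips if f]                        # filtering, not lying

heads = sum(reported)
tails = len(reported) - heads                             # zero, by construction
posterior_mean = (1 + heads) / (2 + heads + tails)        # mean of Beta(1 + heads, 1 + tails)

print(f"reports seen: {len(reported)}, posterior mean for p: {posterior_mean:.4f}")
# ~0.9998: near-certainty in a false conclusion, built entirely from true evidence.
```

Which is why, if you suspect the stream of evidence is filtered, being very wary about updating on it is a defensible move.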
That’s just rationalization. Again, even in the context of a simple hypothetical example with very clear and unfiltered evidence, participants were not willing to change their minds. I suggest you look at the actual study.
What is just rationalization? It seems pretty obvious to me that if your stream of evidence is filtered in some way, you should be very wary about updating on it.
Yes, but that does not apply to this study; the participants weren’t even willing to acknowledge the statistical results when they disagreed with their point of view, let alone change their minds.
As for your point about evidence being selectively presented: it’s easy to discard all information that disagrees with your worldview. What’s hard is actually changing your mind. If you believe that there is a ‘culture war’ going on, with filtering of evidence and manipulation from all sides, the rational response would be to look at all evidence skeptically, not just the evidence that disagrees with you. Yet in that study, participants had no problem accepting statistical results that agreed with them or that they didn’t have strong political opinions about. And, importantly, this behaviour got worse for smarter people.
I’m not much interested in that particular study. I’m discussing your tl;dr, which is
The smarter you are, the less likely you are to change your mind on certain issues when presented with new information
You, clearly, think this is bad. I, on the contrary, think that in certain situations—to wit, when your stream of evidence is filtered—NOT updating on new information is a good idea.
I feel this is a more interesting issue than going into the details of that study.
The smarter you are, the less likely you are to change your mind on certain issues when presented with new information, even when the new information is very clearly, simply, and unambiguously against your point of view.
Also, as George Orwell said, “There are some ideas so absurd that only an intellectual could believe them.”
While that is the way Ezra Klein is interpreting it, I don’t think that’s exactly right. It’s not that smart people are less likely to change their mind; it’s that smart people who are also partisan are less likely to change their mind. The combination of intelligence and closed-mindedness is dangerous, I would agree. But I believe intelligence is correlated with open-mindedness, so this is either a very narrow effect (which is what Ezra Klein seems to be suggesting) or an artifact of the study design.
Actually, never mind about part of this. I had assumed they were using the median to divide between conservative and liberal, in which case people who identified as moderate would be thrown out, but they’re using the mean, which is most likely a number in between the possible options, so everybody gets included. Moderates end up grouped with either liberals or conservatives; I’m not sure which.
I don’t think open-mindedness is the same as the ability to get the math right on emotionally charged topics.
The ability to get the math right in contexts like that is part of what Keith Stanovich wants to measure with the rationality index.
Unfortunately, in writing the article, Vox themselves seem to have fallen prey to some of the same stupidity; if you’re familiar with Vox’s general left-wing sympathies, you’ll be unsurprised that the examples of stupidity used in the article are overwhelmingly from right-wing sources. If you really want to improve people’s thinking, you need to focus on your own tribe at least as much as the enemy tribe.
I previously wrote about this here.
The example they give is actually anti-gun-control (it is a contrived example, of course), and they repeatedly mention that the biases in question affect individuals who identify as left-wing as well as individuals who identify as right-wing.
If you really want to improve people’s thinking, you need to focus on your own tribe at least as much as the enemy tribe.
Why? I looked at your linked article and the two articles it links to and I can’t find any proof that doing what you say would result in fewer disagreements than not doing that.