Try this: Choose a book that you expect to disagree with and read it from start to finish over several weeks. See what impact it has on you. I tried this and felt my beliefs changing despite none of the arguments being convincing. It seemed to peter out a few weeks after I finished the book. I hypothesize that in an extended experiment we could actually brainwash ourselves to the point of holding some radically different views.
The psychology research I’m aware of suggests, if anything, the opposite effect: reading opposing views tends to make you more sure of your original views. This is usually filed under confirmation bias, sometimes as “biased assimilation” or “attitude polarization”.
The study described in the link only exposed its subjects to a single article. The effect might well differ with the amount of exposure.
In my own experience this seems to be the case. When I briefly read politically opposing blogs, I find them so obviously stupid that I’m amazed anyone could take the other side seriously. But when I spend a long while doing it, I find my views moderating and sometimes even crossing over, despite not being convinced by any of their actual arguments, and I begin to be embarrassed by figures I normally admire, even though most of what I find directed against them is mere pejoratives. Afterward the effect wears off. I could be unusually easily led, but I’ve heard of enough similar experiences from others that I doubt it.
This is actually quite useful, and it’s how I got to the political views I have today. I started out very liberal, having been raised by liberals. I realized that this was a problem for the reasons in the OP, and also that I was biased by my need to identify with the group. To break myself out of these thought patterns, I read The Fountainhead with as open and accepting a mind as possible. It didn’t contain much in the way of rational argument, but it shook up my belief system enough that I no longer feel the same tribal attachment to any political party. This in turn let me form an opinion on each issue based on evidence, and I now hold some opinions that would be called “liberal” and others that would be called “conservative”.
Oddly enough, the book that prompted my post was Atlas Shrugged :)
Can’t say I’m surprised, since I was about to mention my own reaction to Atlas Shrugged. I have continuously, near-obsessively dissected the book (both the rationality of its arguments and the quality of its writing)… and I still find my views changing, or at least my initial reactions changing, the more of it I read. It’s a very odd experience. I have no idea what would have happened if I’d started reading it younger (I’m 28) and less aware of how politics and business proceed in the real world.
With that said, I think the effect is a net positive. I now notice more of the things Objectivists object to in the everyday world: it grabs my attention much more to hear somebody say something like “well, he really needed the job”. But it doesn’t seem to have interfered with my ability to analyze the situation (for example, when multiple candidates are sufficiently qualified and no other differences are significant, the most productive thing is to give the job to the qualified person who needs it most). Picking apart the places where Rand is wrong, or at least fails to make a convincing argument, has both equipped me to argue against those viewpoints when others express them and heightened my ability to see the places where she’s right.
Bringing this back on topic, though, I’m not sure how parallel the scenarios (reading a book by choice but with the conscious intention of exploring the author’s ideas and biases vs. picking up biases by accident from a teacher) really are. Part of that may be that I do not automatically associate a book with the personhood of the author, the way I associate a class with the personhood of the teacher (indeed, I have to constantly consciously remind myself of Rand’s own experiences to even begin to comprehend some of her arguments; I have never had to similarly remind myself more than once when dealing with a person in the flesh). I certainly internalize lessons and viewpoints much more in person than I do from a text.
Relatedly, I need to get myself to some presentations and/or workshops on rationality. I’m new here, and many of the concepts I’m trying to learn still feel… slippery, in a way that things I learned from a “real person” almost never are. Of course, the fact that I’m trying to become more rational, while in no way trying to become an Objectivist, may make a big difference. Too many axes for the data I have, I think, though further analysis may show otherwise.
I hypothesize that in an extended experiment we could actually brainwash ourselves to the point of holding some radically different views.
This is the first of Lifton’s eight criteria for thought reform, milieu control: one is systematically exposed to only one side of the evidence, and isolated from the other side.
To make this brainwashing more efficient, there are additional techniques:
Live among people who will make you feel ashamed for not believing in X. These people should love you as a person, but hate any non-X-ness. Expose your doubts about X in front of the group; it will help them understand and modify your thought processes.
Develop group-specific jargon: if you make your pro-X arguments using words an outsider does not understand, the outsider cannot refute them.
If you ever experience or remember something that disagrees with X, be a good Bayesian and remember that with probability at least epsilon your experience or memory is wrong. On the other hand, assign prior probability 1 to X. This way, whatever evidence you encounter, it remains perfectly rational to believe in X.
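For what it’s worth, the arithmetic behind that last technique really is valid, which is what makes it insidious. Writing E for any piece of evidence you might observe, Bayes’ rule with a prior of exactly 1 gives

$$P(X \mid E) = \frac{P(E \mid X)\,P(X)}{P(E \mid X)\,P(X) + P(E \mid \neg X)\,P(\neg X)} = \frac{P(E \mid X) \cdot 1}{P(E \mid X) \cdot 1 + P(E \mid \neg X) \cdot 0} = 1,$$

assuming P(E | X) > 0 so that the conditioning is defined. The update itself is flawless; the irrationality was smuggled in through the prior, which no finite amount of evidence can dislodge.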