How Facts Backfire

Mankind may be crooked timber, as Kant put it, uniquely susceptible to ignorance and misinformation, but it’s an article of faith that knowledge is the best remedy. If people are furnished with the facts, they will be clearer thinkers and better citizens. If they are ignorant, facts will enlighten them. If they are mistaken, facts will set them straight.
In the end, truth will out. Won’t it?
Maybe not. Recently, a few political scientists have begun to discover a human tendency deeply discouraging to anyone with faith in the power of information.
There are a number of ways you can run with this article. It is interesting to see it in the major press. It is also a little ironic that it presents facts to try to overturn an opinion (namely, the opinion that information is not effective at overturning opinions).
What does this mean for existential risk, and for thinking better in general? Obviously facts can sometimes overturn opinions, but it makes me wonder: where is the organisation that uses non-fact-based methods to sway opinion about existential risk? It would make sense for them to be separate; the fact-based organisations (SIAI, FHI) need to be honest so that people who are fact-philic will trust their message. I tend to ignore the fact-phobic (with respect to existential risk) people. But if it became sufficiently clear that foom-style AI was possible, engineering society would become necessary.
Interesting tidbit from the article:

One avenue may involve self-esteem. Nyhan worked on one study in which he showed that people who were given a self-affirmation exercise were more likely to consider new information than people who had not. In other words, if you feel good about yourself, you’ll listen — and if you feel insecure or threatened, you won’t.
I have long been thinking that the openly aggressive approach some display in promoting atheism / political ideas / whatever seems counterproductive, and more likely to make people stop listening than to get them to listen. These results seem to support that, though there have also been contradictory reports from people saying that the very aggressiveness was what made them actually think.
I’d guess aggression would have a polarising effect, depending upon ingroup or outgroup affiliation.
Aggression from a member of your own group is directed at something important that you ought to take note of. Aggression from an outsider is possibly directed at you, and so something to be ignored (if not credible) or countered.
We really need some students to do some tests on, or a better way of searching psych research than Google.
Data point: for years I had the correct arguments in hand, had indeed generated many of them myself, and simply refused to update; then Eliezer, Cectic, and Dan Meissler ganged up on me and got the job done.
I think Jesus and Mo helped too, now that I think of it. That period’s already getting murky in my head =/
Anyhow, point is, none of the above are what you’d call gentle.
ETA: I really do think humor is incredibly corrosive to religion. Years before this, the closest I ever came to deconversion was right after I read “Kissing Hank’s Ass”.
These results seem to support that, though there have also been contradictory reports from people saying that the very aggressiveness was what made them actually think.
Presumably there’s heterogeneity in people’s reactions to aggressiveness and to soft approaches. Most likely a minority of people react better to aggressive approaches and most people react better to being fed opposing arguments in a sandwich with self-affirmation bread.
I have long been thinking that the openly aggressive approach some display in promoting atheism / political ideas / whatever seems counterproductive, and more likely to make people stop listening than to get them to listen.
I believe aggressive debates are not about convincing the people you are debating with; that is likely to be impossible. Instead they are about convincing third parties who have not yet made up their minds. For that purpose it might be better to take an overly extreme position and to attack your opponents as much as possible.
I think one of the reasons this self-esteem seeding works is that identifying your core values makes other issues look less important.
On the other hand, if you had, e.g., independently expressed that God is an important element of your identity and that belief in him is one of your treasured values, then the exercise may backfire, and it will be even harder to move you away from that belief. (Of course I am not sure: I have never seen any scientific data on this. It is purely a wild guess.)
The primary study in question is here. I haven’t been able to locate an online copy of the study about self-esteem and corrections.