We actually label persuasive strategies that can be used to market our true ideas as “dark arts”.
The linked page specifies that the “dark arts” specifically take advantage of biases for persuasion. So it’s a bit misleading to say “We actually label persuasive strategies that can be used…”, because we do not label all strategies as such. Our goal should be to snap people out of their biases, so that they can naturally accept anything that turns out to be true. That could be taken as a “persuasive strategy”, but it is not a dark one.
I’d favour attempts to develop bias-busting techniques intended to be used on general audiences. (Have there been any discussions about this — developing, as it were, a Defense Against the Dark Arts curriculum?) I would oppose attempts to evangelize our conclusions to general audiences without imparting to them the underlying framework of rationality that would allow them to independently discover or at least verify these conclusions. Using sophistry to persuade someone of a truth isn’t much better than using the same tricks to persuade them of a falsehood, and even if we ignore moral issues, it is nearly useless, because if a person ends up with better ideas but all the same biases, their heads can later just as easily be filled with whole new sets of bad ideas by other Dark Arts practitioners.
Edit: I’d like to add that I don’t mean that I believe biases can be countered with Pure Reason — which, I suppose, is what makes them biases in the first place. As the saying goes, we can’t reason people out of what they were never reasoned into. Debiasing techniques will not consist solely of rational arguments based on evidence, because the goal is to get people to the point where they can accept such things in the first place. But that does not mean that we have to, or ought to, resort to actively bad reasoning. (Especially to someone with an anti-reason worldview — serious anti-epistemology — invalid logical arguments don’t work any better than valid ones anyway.) That, I think, is something that the prominent members of the “New Atheist” movement (it’s not all that new, I know) are getting right (for the most part). This movement is unapologetic, it’s emotional, it can be abrasive and offensive at times, but it’s not dishonest. As one example, see PZ Myers strongly criticizing a recent study which, in part, appears to show a positive correlation between IQ and atheism. He didn’t have to criticize it. He could have been among those “patting [them]selves on the back”. But he sided with intellectual honesty over embracing a potentially flawed argumentative tool. If we want to spread rationality, we should be thinking along the same lines.
Thanks for the detailed reply—I’ll try to respond to each of your points.
First of all, using dark arts does not imply you have to tell outright lies.
Secondly, you say “if a person ends up with better ideas but all the same biases, their heads can later just as easily be filled with whole new sets of bad ideas by other Dark Arts practitioners.” When the alternative is that they only had bad ideas in their head, this is still a win. And your example is the minimum win possible. What if we used dark arts to help someone remove a cognitive bias? Is it now justified?
Third, PZ Myers chose a very effective persuasion strategy, The Admirable Admission Pitch. However, one case where someone was effective sans-dark arts hardly proves the sans-dark arts approach is optimal in general. When you look at a heavyweight persuader like Al Gore, you can see he makes heavy use of the dark arts.
Finally, you’re correct with respect to the problem you pointed out in your first paragraph. I’ll tweak the post to fix it.
First of all, using dark arts does not imply you have to tell outright lies.
It does, in a very real way: if you say “You should believe X because Y”, and Y is not actually a good reason to believe X, you have spoken a falsehood, whether Y is a lie about external reality or just some rhetorical trick.
Secondly, you say “if a person ends up with better ideas but all the same biases, their heads can later just as easily be filled with whole new sets of bad ideas by other Dark Arts practitioners.” When the alternative is that they only had bad ideas in their head, this is still a win.
I am not convinced of this. You are right in a strictly utilitarian sense — it is better for someone to have good ideas for bad reasons than to have bad ideas for bad reasons — but in most cases, it’s a false dilemma. At most, using the Dark Arts is only justified if someone is absolutely intractable in attempts to debias them. Using them too soon could amount to a missed opportunity — causing us to declare victory and move on too quickly, and/or causing them to be even less open to a more fundamental debiasing afterwards. Let’s say we convince a Christian to believe in evolution by arguing as though it can be reconciled with a day-age reading of Genesis. This is a bit better than them being a literalist young-earth creationist, but they have not become more rational. And if you convince them of evolution thusly and then try to convince them that their religious epistemology is wrong altogether, I can imagine them saying “Come on, you convinced me to believe in evolution, what more would you want from me?” or “Liar! You said believing in evolution wouldn’t be a slippery slope to atheism!” or “What, so I believed that science was incompatible with my religion, and you convinced me they’re compatible after all, but now you’ve switched to arguing that they aren’t?”. If you want to make someone question their deepest and most cherished beliefs, they are likely to take you even less seriously if you previously convinced them of a lesser point by acting like those beliefs could be true.
(That is a hypothesis, supported only by some personal experience and intuition. It can probably be tested; until then, I invite anyone with personal experience or other thoughts on this point, whether supporting or opposing it, to share them.)
And your example is the minimum win possible. What if we used dark arts to help someone remove a cognitive bias? Is it now justified?
I’m not quite sure what that would look like. Do you think you could formulate an example?
I might be persuaded that this is justified. It would be something like writing a computer virus whose purpose is to patch the security holes it uses to get in in the first place. But before making any judgment I’d still like an example of how it could work in this context.
Third, PZ Myers chose a very effective persuasion strategy, The Admirable Admission Pitch. However, one case where someone was effective sans-dark arts hardly proves the sans-dark arts approach is optimal in general.
You write that “it can disarm opponents, and portray you as a paragon of reasonableness and open-mindedness.” I seriously doubt that his usual opponents will find him more reasonable, open-minded, or respectable on this basis (at least in any lasting way), and I expect he knows that perfectly well.
Still, it was just one example, and I could be wrong. I’ll retract that argument. You get to the deeper point when you question whether “the sans-dark arts approach is optimal in general”. Optimal for what? Perhaps we don’t even care about the same underlying goals. If someone cares, for instance, about promoting belief in evolution, then the Dark Arts might help. If someone cares about promoting atheism, the Dark Arts might help. But since we’re talking here about promoting rationality itself, using the Dark Arts is promoting the very thing we are trying to destroy. It could cause all manner of misconceptions:

1) To the public: that rationalism is akin to sophistry or a religion, and that we are not to be trusted; don’t listen to us or else we might trick you into changing your mind.

2) To the people we persuade: that we’re authorities to be trusted and accepted, not ordinary people setting a good example to follow. (Or, even worse: they become rational enough to notice how they were manipulated, but then they do follow the example we were setting, concluding that the Dark Arts are an okay thing to use in general.)

3) To people we fail to persuade: that we’re dangerous liars who must be stopped.
It’s simply the opposite of the kind of epistemology we want to promote. I think the worst part would be the bad example it sets.
Saying there are white arts as well as dark ones is conceding the point, isn’t it? One should be allowed to be persuasive as well as right, and sometimes just being right isn’t enough, especially if the audience is judging the surface appeal of an argument (and maybe even accepting it or not!) prior to digging into its meat. In such situations, attractive wrapping isn’t just pretty, it’s a prerequisite. So, I love your idea of inventing a protocol for DAtDA.
Video of killing a cognitive bias with dark arts: http://www.youtube.com/watch?v=haP7Ys9ocTk
(Also illustrates Bongo’s first bullet point in a comment above)
To use the virus metaphor again, this is like a security expert who finds exploits and reports them so they can be fixed. (Or, more closely analogous to this video, working with an individual and delivering an active but harmless virus to their computer so they will become aware of and concerned about the potential for real harm.) The grey area, somehow using invalid but persuasive arguments against people’s actual biases, is like my previous example: making a virus that patches the very holes it uses to get in. Using the Dark Arts is like using those exploits to install things on people’s computers on the basis that you’re only using them to install really good software that people ought to have anyway.
So, showing someone by demonstration how a particular thought pattern serves them poorly is not what I’m talking about. That’s a good thing. (I was going to say it’s “not the Dark Arts”, but so we don’t get into arguing about the definition, I’ll just say that this is an example of something I support, while I think I’ve given enough examples of the sort of thing that I don’t support for the distinction to be clear. It is indeed pretty much what Bongo is saying. My point right now is that the two concepts are different enough that we shouldn’t be referring to them with the same term, especially a connotation-heavy one like “Dark Arts”.)
Yes, this is one of the reasons I have serious doubts about global warming.