Thanks for the detailed reply—I’ll try to respond to each of your points.
First of all, using dark arts does not imply you have to tell outright lies.
Secondly, you say “if a person ends up with better ideas but all the same biases, their heads can later just as easily be filled with whole new sets of bad ideas by other Dark Arts practitioners.” When the alternative is that they only had bad ideas in their head, this is still a win. And your example is the minimum win possible. What if we used dark arts to help someone remove a cognitive bias? Is it now justified?
Third, PZ Myers chose a very effective persuasion strategy, the Admirable Admission Pitch. However, one case where someone was effective sans dark arts hardly proves the sans-dark-arts approach is optimal in general. When you look at a heavyweight persuader like Al Gore, you can see that he makes heavy use of the dark arts.
Finally, you’re correct with respect to the problem you pointed out in your first paragraph. I’ll tweak the post to fix it.
First of all, using dark arts does not imply you have to tell outright lies.
It does, in a very real way: if you say “You should believe X because Y”, and Y is not actually a good reason to believe X, you have spoken a falsehood, whether Y is a lie about external reality or just some rhetorical trick.
Secondly, you say “if a person ends up with better ideas but all the same biases, their heads can later just as easily be filled with whole new sets of bad ideas by other Dark Arts practitioners.” When the alternative is that they only had bad ideas in their head, this is still a win.
I am not convinced of this. You are right in a strictly utilitarian sense — it is better for someone to have good ideas for bad reasons than to have bad ideas for bad reasons — but in most cases it’s a false dilemma. At most, using the Dark Arts is justified only if someone proves absolutely intractable to every attempt at debiasing. Using them too soon could amount to a missed opportunity: it could cause us to declare victory and move on too quickly, and/or leave them even less open to a more fundamental debiasing afterwards.

Let’s say we convince a Christian to believe in evolution by arguing as though it can be reconciled with a day-age reading of Genesis. This is a bit better than them being a literalist young-earth creationist, but they have not become more rational. And if you convince them of evolution in this way and then try to convince them that their religious epistemology is wrong altogether, I can imagine them saying “Come on, you convinced me to believe in evolution, what more do you want from me?” or “Liar! You said believing in evolution wouldn’t be a slippery slope to atheism!” or “What, so I believed that science was incompatible with my religion, and you convinced me they’re compatible after all, but now you’ve switched to arguing that they aren’t?”. If you want to make someone question their deepest and most cherished beliefs, they are likely to take you even less seriously if you previously convinced them of a lesser point by acting as though those beliefs could be true.
(That is a hypothesis, supported only by some personal experience and intuition. It can probably be tested; until then, I invite anyone with personal experience or other thoughts on this point, whether supporting or opposing it, to share them.)
And your example is the minimum win possible. What if we used dark arts to help someone remove a cognitive bias? Is it now justified?
I’m not quite sure what that would look like. Do you think you could formulate an example?
I might be persuaded that this is justified. It would be something like writing a computer virus whose purpose is to patch the security holes it uses to get in in the first place. But before making any judgment I’d still like an example of how it could work in this context.
Third, PZ Myers chose a very effective persuasion strategy, the Admirable Admission Pitch. However, one case where someone was effective sans dark arts hardly proves the sans-dark-arts approach is optimal in general.
You write that “it can disarm opponents, and portray you as a paragon of reasonableness and open-mindedness.” I seriously doubt that his usual opponents will find him more reasonable, open-minded, or respectable on this basis (at least in any lasting way), and I expect he knows that perfectly well.
Still, it was just one example, and I could be wrong. I’ll retract that argument. You get to the deeper point when you question whether “the sans-dark-arts approach is optimal in general”. Optimal for what? Perhaps we don’t even care about the same underlying goals. If someone cares, for instance, about promoting belief in evolution, then the Dark Arts might help. If someone cares about promoting atheism, the Dark Arts might help. But since we’re talking here about promoting rationality itself, using the Dark Arts is promoting the very thing we are trying to destroy. It could cause all manner of misconceptions:

1) To the public: that rationalism is akin to sophism or a religion, and that we are not to be trusted; don’t listen to us or else we might trick you into changing your mind.

2) To the people we persuade: that we’re authorities to be trusted and accepted, not ordinary people setting a good example to follow. (Or, even worse: they become rational enough to notice how they were manipulated, but then they do follow the example we were setting, concluding that the Dark Arts are an okay thing to use in general.)

3) To people we fail to persuade: that we’re dangerous liars who must be stopped.
It’s simply the opposite of the kind of epistemology we want to promote. I think the worst part would be the bad example it sets.
Video of killing a cognitive bias with dark arts: http://www.youtube.com/watch?v=haP7Ys9ocTk
(Also illustrates Bongo’s first bullet point in a comment above.)
To use the virus metaphor again, this is like a security expert who finds exploits and reports them so they can be fixed. (Or, more closely analogous to this video, working with an individual and delivering an active but harmless virus to their computer so they will become aware of and concerned about the potential for real harm.) The grey area, somehow using invalid but persuasive arguments against people’s actual biases, is like my previous example of making a virus that patches the very holes it uses to get in. Using the Dark Arts is like using those exploits to install things on people’s computers on the grounds that you’re only installing really good software that people ought to have anyway.
So, showing someone by demonstration how a particular thought pattern serves them poorly is not what I’m talking about. That’s a good thing. (I was going to say it’s “not the Dark Arts”, but so we don’t get into arguing about the definition, I’ll just say that this is an example of something I support, while I think I’ve given enough examples of the sort of thing that I don’t support for the distinction to be clear. It is indeed pretty much what Bongo is saying. My point right now is that the two concepts are different enough that we shouldn’t be referring to them with the same term, especially a connotation-heavy one like “Dark Arts”.)
Yes, this is one of the reasons I have serious doubts about global warming.