I find this to be true, but only to a point. Those blind spots in our beliefs are usually subconscious, and so in non-combative discussion they just never come up at all. In combative discussion you find yourself defending them even without consciously realizing why you’re so worried about that part of your belief (something something Belief in Belief).
I almost always find that when I’ve engaged in a combative discussion I’ll update around an hour later, when I notice ways I defended my position that are silly in hindsight.
This is an excellent point, and I too have had this experience.
Very relevant to this are Arthur Schopenhauer’s comments in the introduction to his excellent Die Kunst, Recht zu behalten (usually translated as The Art of Controversy). Schopenhauer comments on people’s vanity, irrationality, stubbornness, and tendency toward rationalization:
If human nature were not base, but thoroughly honourable, we should in every debate have no other aim than the discovery of truth; we should not in the least care whether the truth proved to be in favour of the opinion which we had begun by expressing, or of the opinion of our adversary. That we should regard as a matter of no moment, or, at any rate, of very secondary consequence; but, as things are, it is the main concern. Our innate vanity, which is particularly sensitive in reference to our intellectual powers, will not suffer us to allow that our first position was wrong and our adversary’s right. The way out of this difficulty would be simply to take the trouble always to form a correct judgment. For this a man would have to think before he spoke. But, with most men, innate vanity is accompanied by loquacity and innate dishonesty. They speak before they think; and even though they may afterwards perceive that they are wrong, and that what they assert is false, they want it to seem the contrary. The interest in truth, which may be presumed to have been their only motive when they stated the proposition alleged to be true, now gives way to the interests of vanity: and so, for the sake of vanity, what is true must seem false, and what is false must seem true.
But, says Schopenhauer, these very tendencies may be turned around and harnessed to our service:
However, this very dishonesty, this persistence in a proposition which seems false even to ourselves, has something to be said for it. It often happens that we begin with the firm conviction of the truth of our statement; but our opponent’s argument appears to refute it. Should we abandon our position at once, we may discover later on that we were right after all: the proof we offered was false, but nevertheless there was a proof for our statement which was true. The argument which would have been our salvation did not occur to us at the moment. Hence we make it a rule to attack a counter-argument, even though to all appearances it is true and forcible, in the belief that its truth is only superficial, and that in the course of the dispute another argument will occur to us by which we may upset it, or succeed in confirming the truth of our statement. In this way we are almost compelled to become dishonest; or, at any rate, the temptation to do so is very great. Thus it is that the weakness of our intellect and the perversity of our will lend each other mutual support; and that, generally, a disputant fights not for truth, but for his proposition, as though it were a battle pro aris et focis. He sets to work per fas et nefas; nay, as we have seen, he cannot easily do otherwise. As a rule, then, every man will insist on maintaining whatever he has said, even though for the moment he may consider it false or doubtful.
[emphasis mine]
Schopenhauer is saying that—to put it in modern terms—we do not have the capability to instantly evaluate all arguments put to us, to think in the moment through all their implications, to spot flaws, etc., and to perform exactly the correct update (or lack of update). So if we immediately admit that our interlocutor is right and we are wrong, as soon as this seems to be the case, then we can very easily be led into error!
So we don’t do that. We defend our position, as it stands at the beginning. And then, after the dispute concludes, we can consider the matter at leisure, and quite possibly change our minds.
Schopenhauer further comments that, as far as the rules and “stratagems” of debate (which form the main part of the book) are concerned—
In following out the rules to this end, no respect should be paid to objective truth, because we usually do not know where the truth lies. As I have said, a man often does not himself know whether he is in the right or not; he often believes it, and is mistaken: both sides often believe it. Truth is in the depths. At the beginning of a contest each man believes, as a rule, that right is on his side; in the course of it, both become doubtful, and the truth is not determined or confirmed until the close.
(Note the parallel, here, to adversarial collaborations—and recall that in each of the collaborations in Scott’s contest, both sides came out of the experience having moved closer to their opponent/collaborator’s position, despite—or, perhaps, because of?—the process involving a full marshaling of arguments for their own initial view!)
So let us not demand—neither of our interlocutors, nor of ourselves—that a compelling argument be immediately accepted. It may well be that stubborn defense of one’s starting position—combined with a willingness to reflect, after the dispute ends, and to change one’s mind later—is a better path to truth.
I see no a priori reason to think that the average adversarial collaboration was combative in nature. The whole idea was to get people to collaborate, and that collaboration would lead to a good outcome.
Understanding the blind spots of the person you are talking with, and bringing those blind spots to their awareness, is a skill. It might very well be that you are not used to talking with people who have that skill set.
If you follow a procedure like double crux and the person you are talking with has a decent skill level, there’s a good chance that they will point out blind spots to you.
I almost always find that when I’ve engaged in a combative discussion I’ll update around an hour later, when I notice ways I defended my position that are silly in hindsight.
“Silly” is a pretty big requirement. It would be better if people didn’t need to believe that their old positions were silly in order to update.
Professional philosophers, as a class whose culture is combative, have a bad track record of changing their opinions when confronted with opposing views. Most still largely hold the positions that were important to them a decade ago.
In Silicon Valley startup culture there are many people who are able to pivot when it turns out that their initial assumptions were wrong.
Nassim Taleb claims in his books that successful traders are good at changing their opinions when there are good arguments for changing positions.
CFAR went from teaching Bayes’ rule and Fermi estimation to teaching parts work. A lot of their curriculum has changed.
Professional philosophers, as a class whose culture is combative, have a bad track record of changing their opinions when confronted with opposing views. Most still largely hold the positions that were important to them a decade ago.
As opposed to whom? Who’s good?
I second that experience.