I can’t count how many arguments I’ve had where this would have prevented hurt feelings. Often, after the argument, I discover the other person persisted in arguing for about ten minutes after they realized they were wrong, all the while getting angrier at me for shooting down ever-worse rationalizations.
To be fair, the way this happens isn’t that the person knowingly persists in arguing for something false; instead, they drop a subtle hint that maybe they might be wrong and we should stop talking about it now (presumably so they can save face). I invariably miss this hint (well, I’m better now that I know to look for it, but not by much) because it usually takes the form of a ridiculous but hard-to-disprove objection, to which I (because I’m weird) will come up with a medium-good response. This pisses my interlocutor off, both because I missed their social cue and because I’ve now forced them to defend a belief (their lousy objection) that they don’t actually hold.
This behavior is very understandable; once I noticed others doing it, I noticed the same tendency in myself. It’s surprisingly hard to say, “Oops, I guess I’m wrong,” or, “I can’t see a good counterargument to what you’re saying; maybe I need to reconsider.”
Anyway, I’m saying this because the article linked by the quoted phrase wasn’t quite what I was hoping for on the subject. :)
Yes, a big problem is the human tendency to associate strongly with beliefs so that they become a part of your identity. When I once got into an argument with a particularly stubborn friend about religion, I tried to depersonalize the arguments as much as possible by writing them down and having an impartial third party blindly check them for inconsistencies and biases under a kind of scoring system. How did it turn out? He gave up, all right, but still retained his beliefs!
a big problem is the human tendency to associate strongly with beliefs so that they become a part of your identity.
This is true. Another thing that makes such arguments difficult/pointless is that the majority of people seem to give rationalizations for what they believe instead of the reasons they actually came to believe it. This is understandable, as people often don’t know (or won’t admit to) how they obtained their beliefs. If someone doesn’t know, won’t admit, or won’t say why they really think what they do, there’s no possible way to present a counter-argument against it. I think there’s a post saying pretty much this around here somewhere.
People seem to rarely admit they are wrong, especially on important issues. I tend to think the root cause of this problem is status preservation, not bias. See this. Confirmation bias then exacerbates things when people try to flesh out a consistent worldview that incorporates the claim they couldn’t deny without losing face.
Often, the real reason one believes something is simply “My parents (or other trusted authority figure) told me it’s true.”