People rarely change their mind when they feel like you have trapped them in some inconsistency [...] In general (but not universally) it is more productive to adopt a collaborative attitude of sincerely trying to help a person articulate, clarify, and substantiate [bolding mine—ZMD]
“People” in general rarely change their mind when they feel like you have trapped them in some inconsistency, but people using the double-crux method in the first place are going to be aspiring rationalists, right? Trapping someone in an inconsistency (if it’s a real inconsistency and not a false perception of one) is collaborative: the thing they were thinking was flawed, and you helped them see the flaw! That’s a good thing! (As it is written of the fifth virtue, “Do not believe you do others a favor if you accept their arguments; the favor is to you.”)
Obviously, I agree that people should try to understand their interlocutors. (If you performatively try to find fault in something you don’t understand, then apparent “faults” you find are likely to be your own misunderstandings rather than actual faults.) But if someone spots an actual inconsistency in my ideas, I want them to tell me right away. Performing the behavior of trying to substantiate something that cannot, in fact, be substantiated (because it contains an inconsistency) is a waste of everyone’s time!
In general (but not universally) it is more productive to adopt a collaborative attitude
Can you say more about what you think the exceptions to the general-but-not-universal rule are? (Um, specifically.)
I would think that inconsistencies are easier to appreciate when they are in the central machinery. A rationalist's beliefs might bear more load, so most beliefs are central to at least something, but I think a centrality/point-of-communication check has more upside than downside. Also, cognitive time spent looking for inconsistencies could be better spent on more constructive activities. Then there is the whole class of heuristics which don't even claim to be consistent. So the ability to pass by an inconsistency without hanging onto it will see use.