I trust in our ability to think up ways of avoiding this. The same way Jiu Jitsu, a lethal, murderous martial art, was redeveloped into Judo, a far less lethal version, I think we can use our sense of rationality to develop an art of debate that doesn’t need to rely on the Dark Arts to persuade the audience (and, secretly, the opponent). Note that becoming a dangerous Judoka takes much more training and effort than becoming a dangerous Jiu Jitsuka, but the end result is much more beautiful to behold, yes? (And it causes much less trouble in the long run, especially before the sort of audience that will judge you on how “nobly” you won rather than how quickly, to the point where even a defeat on the field becomes a victory with the public, Rocky-style.)
This includes sparring techniques and training methods: training a man to kill takes very different methods from training him to lawfully win street fights. The wrong training in the wrong circumstances can prove disastrous, as happens when you send combat troops to do the job of military police.
I don’t trust anyone’s ability to avoid this pitfall. Rationality skills are often learned on the 5-second level, so it’s paramount that we train ourselves not to instinctively rationalize things. In general, when persuasion takes precedence over truth-seeking, you’re no longer talking about rationality; you’re just talking about how to win debates. I agree that there are white-hat ways to debate and black-hat ways to debate, and that public speaking is a valuable skill, but rationality is never about arguing for a particular conclusion. As a result, I think it’s best to keep the two concepts separate and be very mindful of which thinking skills you are using for each activity.
Our ability as a group. See, you yourself are already contributing to averting those problems.
As long as both participants really want the truth and are willing to lose, I don’t think this will be a problem. When following a chain of logic by yourself, you are the only one who can notice your rationalizations. Chavrutas might provide extra incentive to rationalize, but for some people, especially many LessWrongers, this will be outweighed by having an extra guard against rationalization.
I’m somewhat skeptical that LessWrong readers are so good at avoiding rationalization that we can engage in activities like this without any adverse cognitive effects. It’s not as though non-LWers are running Windows ME and we are running Linux; we all share a similar cognitive architecture, and we are all prone to rationalization. Being aware of this fact is not enough to prevent it, just as seeing an optical illusion does not make it go away. It takes a conscious effort to think rationally, and I think we should be focusing our efforts on developing epistemic rigor rather than engaging in something potentially poisonous.
Furthermore, while having a partner in a calm, reasoned discussion will help you catch yourself in the act of rationalizing, having a partner whose purpose is to argue with you probably won’t.
I have a very different impression of the mental consequences of debating than you do; I don’t think that debating always provides a very strong incentive to rationalize, since that depends on the context, the relationship between the participants, etc. Am I incorrect?
Well, I can tell you about my own experience: I participated in organized debate in high school and at university for a total of 6 years. After a while, rationalization came naturally, and I couldn’t tell the difference between rationalization and non-rationalization. In early 2011 I started re-reading the Core Sequences (I read them for the first time in mid-2010), and some of the posts on rationalization really “clicked” the second time around. I gradually realized that I had to make a deliberate effort to not rationalize, and I tasked myself with double-checking as many of my own thoughts as possible. Since then I’ve improved somewhat, and I can sometimes catch myself in the act of rationalizing. But I’ve got a long way to go, and I think that’s partly a result of training myself to accept a randomly assigned conclusion and manufacture arguments to match.
I agree that some kinds of debates do discourage rationalization, but I’m worried that as soon as you give up your bottom line, you put yourself at a great deal of epistemic risk.
The kind of debate one would use with a rationalist chavruta should be defending a point that you actually believe using the reasons that actually cause you to believe it. Maybe the word ‘debate’ has the wrong connotations, especially in the context of debate clubs, which are epistemically horrible, but I think humans can try to convince each other of things without rationalizing.
I completely agree. Though there would still be some rationalization for social reasons, as JoshuaZ points out below, if both partners were being completely honest with each other then there would be significantly less cause for concern.
I’m not quite sure what you mean here. Do you think that many LWers could maintain a high enough level of honesty for this to work? I think that we could, though some people are more competitive and less truth-seeking in these types of scenarios, so I wouldn’t recommend it for everyone.
As long as there is nothing incentivizing people to be anything other than completely honest (as there is in an organized debate), then I’m much less concerned (but not entirely unconcerned). And I agree that not everyone is capable of being entirely honest.
If we can’t be entirely honest, then what are we doing here?
We don’t always know our true rejections; sometimes we even invent rejections during the debate. This is a bias that should be reduced as much as possible, but it can’t always be eliminated.
I participated in organized debate for some time and was quite good at it. I am not convinced that I learned anything other than sophistry and selective reasoning from the experience. Worse, I’m not convinced that that wasn’t the point.
If at least some of the chavruta arguments are public, then the odds go up that weak arguments will be recognized for what they are.
“Jews and Gentiles” puts the dividing line in the wrong place. I’m Jewish, but I know little about how Talmud is studied. I’d never heard of chavrutas till your post.
… Yup, that basically sums up my perception of it, superficially at least.