Bad, bad idea. There’s no way to avoid losing an argument, because most of the time arguments are social wrestling contests / displays of influence and status.
The only thing you can do is to always make sure you’re supporting the right side. That doesn’t guarantee that you don’t lose, if losing is defined as not coming out as the social victor and failing to convince your opponent or your listeners.
You can’t control the responses of others. You can’t force them to be rational. All you can do is be correct.
As a rationalist, that’s all you should really care about anyway. But precious few of us are rationalists.
Surely you don’t mean that simpliciter. There are other things that one should care about, perhaps even while wearing a ‘rationalist’ hat. Being able to argue in a way that supports the truth without alienating one’s friends seems like a worthwhile endeavor.
No person who chooses his friends over speaking the truth is a rationalist.
Everybody has their priorities. A rationalist’s priorities can be only a limited subset of the possible ones.
Then rationalists are terribly vicious people.
Honesty is a virtue. Too much concern for truth-telling is a vice (one might call such a person a ‘stickler’).
This post does not ask you to pursue friendship at the expense of learning the truth. It instead suggests a way of helping other people come to the truth, in a way that can advance friendship.
Once again, we are running into the problem of the term “rational” referring to at least two different concepts.
1. Epistemic rationality: The map should reflect the territory. Truth above all else.
2. Instrumental rationality: You should win. My values are such that making my friends happy is a form of winning.
A commitment to epistemic rationality means taking maximum care that one’s own beliefs are correct. It doesn’t say anything about what one should say to others.
You can’t expect to achieve your goals unless you can match options with outcomes. How we define ‘winning’ is itself something that’s determined by our goals, and reality determines which goals are self-compatible.
& 3. Honesty: Always speak the whole truth as you know it to everyone.
Eliezer, that’s not what honesty means. Honesty is a term of virtue. It represents the appropriate amount of concern for truth-telling. Here’s a thought experiment:
Let’s consider two hypothetical people, Harry and Stan. Harry and Stan are both very honest people—they do their best to never lie or otherwise deceive people.
Suppose that Anne Frank is hiding in the attic, and the Nazis come asking if she’s there. Harry doesn’t want to tell them, but Stan insists he mustn’t deceive the Nazis, regardless of his desire to save Anne’s life.
Now, is Stan therefore more honest than Harry?
Depends on what Harry says.
If Harry says “I don’t know where Anne is”, then he’s lying, and thus less honest than Stan.
If Harry says “I don’t want to tell you where Anne is”, then he’s not stating a falsehood, nor “deceiving” the Nazis (in any sense that I can think of), so he would probably be “comparably”, if not “equally”, as honest as Stan.
If this seems to conflict with your ethical intuitions, it may be because you associate “ethical goodness” with “honesty”, whereas I consider “honesty” to be merely “usually ethically good”, and in the fully general case an “ethically neutral” concept.
But if Harry says “I don’t want to tell you where Anne is”, he will arouse the suspicion of the Nazis, who will search his house and find Anne.
In this thought experiment, the price Stan pays to maintain his code of honesty is one human life. What benefits accrue from his code that make such sacrifices worthwhile?
Stan isn’t a consequentialist; he considers himself to have done the right thing even though there are alternatives that would overall have better consequences, and so he won’t look for benefits to justify his decision.
I think the world needs more Harrys and fewer Stans; or at least, I think that would have better consequences, and that’s what I value.
It is an empirical truth that people tend to become more like what they pretend to be.
You can’t pretend to agree with a counter-rational position, or believe in counter-rational arguments, without degrading your own rationality. Whether it’s theoretically possible for minds to exist such that this does not occur doesn’t matter. Our minds are structured so that it does.
Furthermore, I am highly skeptical that rationalists, who value rationality enough to stick to it even in the face of inconvenience and countermotivations, could plausibly value inaccuracy in the people they supposedly care about.