I worry this post will be dismissed as trivial. I simultaneously worry that, even with the above disclaimer, someone is going to respond, “Chris admits to thinking lying is often okay, now we can’t trust anything he says!”
If you extract the hyperbole, this is entirely valid reasoning. An observed pattern of lies (or an outright declaration of such a pattern) does mean that people should trust everything you say somewhat less than they otherwise would. Reputation matters. Expecting people to trust your word as much when you lie to them as when you don’t would be foolish. The tradeoff may well be worthwhile, but you must acknowledge that it is a tradeoff.
If you’re thinking of saying that, that’s your problem, not mine
False. It is their problem and yours. People not believing you is obviously a negative consequence to you. Acknowledge it, and choose to accept that negative consequence anyway because of the other benefits you get from lying. (Or, I suppose, you could use selective epistemic irrationality as a dominance move and as the typical way to defect on an ultimatum game. Whatever works.)
all but the most prolific liars don’t lie anything like half the time, so what they say is still significant evidence, most of the time.
With the caveat that ‘most of the time’ excludes precisely the times when it matters to them most. Assuming a vaguely rational liar, the times when they should be trusted least are the times when being believed would benefit them the most.
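To make the ‘significant evidence, most of the time’ point and this caveat concrete, here is a minimal Bayes-rule sketch. Every lie rate below is a made-up illustrative assumption, not a measurement of anyone’s actual behaviour:

```python
# Minimal Bayes-rule sketch of how much weight to give "X asserted claim P",
# given how readily X lies. Every rate below is an illustrative assumption.

def posterior_true(prior, p_assert_if_true, p_assert_if_false):
    """P(P is true | X asserted P), by Bayes' rule."""
    numerator = p_assert_if_true * prior
    return numerator / (numerator + p_assert_if_false * (1 - prior))

prior = 0.5  # credence in P before X says anything

# Low-stakes topic: X has no real incentive to lie about it.
print(posterior_true(prior, p_assert_if_true=0.95, p_assert_if_false=0.05))  # ~0.95

# High-stakes topic where being believed benefits X, so X lies far more readily.
print(posterior_true(prior, p_assert_if_true=0.95, p_assert_if_false=0.60))  # ~0.61
```

With these illustrative numbers, an assertion on a low-stakes topic moves a 50% prior to about 95%, while the same assertion on a topic where being believed pays off for the speaker only moves it to about 61%, which is the caveat above in numerical form.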
An observed pattern of lies (or an outright declaration of such a pattern) does mean that people should trust everything you say somewhat less than they otherwise would.
Really? Someone saying “I do the socially normal thing with white lies” is reason to distrust what they say about science?
Saying “I do the socially normal thing” is pretty good evidence that you don’t do the socially normal thing.
In a sense, yes. Normally you don’t announce you do the socially normal thing. But when you’re in a subculture where lots of people don’t do the socially normal thing...
Agreed. Very few of the positions on lying taken in this thread could be classified as “socially normal” outside of (or in a number of cases even inside of) LW-associated circles.
Someone saying “I do the socially normal thing with white lies” is reason to distrust what they say about science?
(I question the claim that this is merely an expression of normality, but assume it for the sake of the answer.)
Yes, it is a reason to trust what they say about science less. The “socially normal” thing to do with respect to mentioning science is to be much more inclined to bring up findings that support one’s own preferred objectives than to bring up other things. It also involves a tendency to frame the science in the most personally favourable light.
An above-normal obsession with epistemic accuracy and truthfulness (which is somewhat typical of people more intellectually inclined and more interested in science) ought to (all else being equal) make one more comfortable trusting someone talking about science. I, for example, often can’t help making references to findings and arguing against positions that could be considered “my side”. That political naivety and epistemic honesty at the expense of agenda is itself some degree of evidence. Possibly evidence that I can’t be trusted as a political ally on the social-perceptions battlefield, but that I can be more useful as a raw information source.
Again, assume “all else being equal” is included in every second sentence above.
To some extent, though probably not to a large extent.
An earlier version of my recent article about trust had the following paragraphs, which I cut since the essay was already long enough:
One of my friends, who I know to be good at telling lies, was recently offended when I admitted that the possibility of them lying to me has sometimes crossed my mind. They protested that they only lie in some very specific situations that force them to lie, and that they’ve never had any reason to lie to me. I think the underlying idea was something like, since they’ve always had a good reason to lie to others, that shouldn’t be counted against them. I tried to explain that I don’t hold it against them, but that doesn’t mean that I could just forget about it, either. Means, motive, opportunity: they’ve demonstrated that they have the means, that while they dislike lying they’re not absolutely opposed to it, and there would certainly have been plenty of opportunities for them to lie to me. And while I cannot imagine a reason for them to lie to me, I also don’t have a full understanding of how their mind works, so I must take into account the possibility of something unforeseen.
None of this causes me to assign “they will lie to me” a very high probability, which is why the thought only crossed my mind without actually causing me to worry.
My views on lying are similar to your friend’s. Thanks for having a charitable reaction.
After reading some of the attitudes in this thread, I find it disconcerting to think that a friend might suddenly view me as having an inscrutable or dangerous psychology if they found out that I believe in white lies in limited situations, like the vast majority of humans. It’s distressing to think that, upon finding this out, they might become so confused about my ethics or behavior patterns… even though presumably, since they were friends with me, they had a positive impression of my ethics and behavior before.
Maybe finding out that a friend is willing to lie causes you to change your picture of their ethics (rhetorical “you”). But why is it news that they lie sometimes? The vast majority of people do. Typical human is typical.
Maybe the worry is that if you don’t know the criteria by which your friends lie, then they might lie to you without you expecting it.
If so, then perhaps there are ways to improve your theory of mind regarding your friends, and then avoid being deceived. You could ask your friends about their beliefs about ethics, or try to discover common reasons or principles behind “white lies.” While people vary on their beliefs about lying, there is probably a lot of intersubjectivity. Just because someone isn’t aware of intersubjective beliefs about the acceptability of lying, it doesn’t mean that their neurotypical friends are capricious about lying. (Of course, if future evidence shows that everyone lies in completely unpredictable ways, then I would change my view.)
For example, if you know that your friend lies in response to compliment-fishing, then you can avoid fishing for compliments from them, or discount their answers if you do. If you know that your friend lies to people he believes are trying to exploit him, then you don’t need to be worried about him lying to you, unless (a) you plan on exploiting him, or (b) you worry that he might think that you are exploiting him even if you aren’t, and he lies rather than notify you.
If that’s the case, then the real worry should be that your friend might feel antagonized by you without you realizing it, and without him being able to talk to you about it. As long as you have good reasons to believe that you won’t have conflict with your friend, or that you will work it out if conflict occurs, it is unlikely that your friend will lie to you for adversarial reasons.
Just because your friends don’t give you (rhetorical “you”) an exhaustive list of the situations where they might lie, or a formalized set of principles, it doesn’t mean that you are in the dark about when they might lie, unless your theory of their mind leaves you in the dark about their behavior in general.
Means, motive, opportunity: they’ve demonstrated that they have the means, that while they dislike lying they’re not absolutely opposed to it, and there would certainly have been plenty of opportunities for them to lie to me. And while I cannot imagine a reason for them to lie to me, I also don’t have a full understanding of how their mind works, so I must take into account the possibility of something unforeseen.
As you correctly observe in your excellent trust post, unforeseen circumstances are always a possibility in relationships. I think your post leads to the conclusion that trusting a person is related to your theory of mind regarding them.
Never-lies vs. believes-that-at-least-some-lies-are-justified is probably not a very useful distinction for reducing unforeseen conflict. Someone who says that they “never lie” could have a different definition of “lies” than you. They might be very good at telling the literal truth in a deceptive way. They might change their ethical view of lying without telling you. They might lie accidentally. Or they might be lying when they say they “never lie,” or they may be speaking figuratively (and mean “I never lie about the important stuff”).
The most useful distinction between people is not whether they will lie, but when. Predicting when your friends might lie is not just a function of your friends’ behavior; it’s also a function of your theory of mind.
Structurally, this post and its comments are extremely similar to the PUA threads.