Hmm. It occurs to me that lying might be a domain that’s particularly prone to typical mind fallacy because people rarely share information about their lying habits. (See “Typical Sex Life Fallacy”)
Here are some categories of experiences I can recall, which I think fall on a spectrum from deliberate to unconscious.
Lying while surprised.
As a teenager, my dad suddenly asked me “have you done Forbidden Activity?” at a time when I wasn’t expecting it. “Oh shit,” I thought. I said “no.”
[in this case, I was explicitly not epistemically cooperating with my dad. My understanding from Blatant Lying is the Best Kind is that this was simulacrum 2 behavior]
Rehearsing a narrative
Perhaps most similar to your experience: talking to a prospective employer at an interview, and realizing they’re about to ask me about X and the truest answer to X is pretty unflattering to myself. Rehearsing a narrative in my head to prepare for that moment, trying to come up with a story that’s true-ish enough that I can justify it to myself, so that by the time they ask me about X I can bullshit my way through it fluently.
[This seemed mostly like playing a simulacrum 4 game with fairly established rules about what is acceptable]
Reflexive lying that’s easy to notice
If someone asks “does this dress make me look fat” and I say “no you look great!”, or someone asks “how’s your project coming along” and I say “great!”, and no she doesn’t look great and/or my project is not going great, it’s usually obvious to me almost immediately, even if I believed it (or wasn’t really paying attention one way or another) at the moment that I said “great!”.
This feels on the edge of the lying/motivated-cognition spectrum, and it seems reasonable to me to classify it as a lie.
Even if the first instance was unconscious, if the conversation continues about how my project is going, subsequent statements are probably deliberate S2 lies, or there is clear, continuous S2 thinking about how to maintain the “things are great!” narrative.
[where this falls on the simulacrum spectrum depends a bit on context, I could see it being level 3 or level 4]
Reflexively bad arguments
Sometimes someone says “Policy X is terrible!” and I think “no, policy X is good! Without Policy X the entire project is doomed!”. And, well, I do think that without Policy X, the project is going to be much harder and failure is more likely. But my statement was clearly politically motivated. “My preferred policy is absolutely essential” probably isn’t true.
A few years ago, I probably wouldn’t even have noticed that “without Policy X the project is doomed” is a bad argument. A few years later, with much deliberate practice in noticing motivated cognition under my belt, I’m capable of noticing “this was a bad argument with the flavor of political motivation” within a few minutes. If we’re talking in person, that’s probably too long for me to catch it in time. In email or blogpost form, I can usually catch it.
[This seems like the sort of level 3 simulacrum thing that can strengthen the level 3-ness of the conversation. I don’t actually think it’s useful to think of simulacrum levels moving in a particular order, so I don’t think it’s usually accurate to say that this is moving the dial from 2 to 3, but I do think it makes it harder to get from 3 to 1]
This study found that 60% of students at UMASS lied at least once in a 10 minute conversation: https://www.eurekalert.org/pub_releases/2002-06/uoma-urf061002.php
It also found that, of those who lied, many were surprised at how often they had lied. I would not be surprised if this is true of many people: they lie at least once every ten minutes, and would be surprised at how often they do.
When I specifically started paying attention to little white lies (in particular, I found that I often reflexively exaggerated to make myself look good or prevent myself from looking bad) I found that I did it WAY more often than I thought. Once I got to a point where I could notice in the moment, I was able to begin correcting, but the first step was just noticing how often it occurred.
That link doesn’t have enough information to find the study, which is likely to contain important methodological caveats.
Here’s the study: https://sci-hub.tw/10.1207/S15324834BASP2402_8
I think the methodology is fairly OK for this sort of high-level analysis, except of course that the sample was all university students from UMass.