“Oh what a tangled web we weave, when first we practise to deceive,” wrote Sir Walter Scott. Not all lies spin out of control; we don’t live in so righteous a universe. But it does occasionally happen that someone lies about a fact, then has to lie about an entangled fact, and then about another fact entangled with that one:
“Where were you?”
“Oh, I was on a business trip.”
“What was the business trip about?”
“I can’t tell you that; it’s proprietary negotiations with a major client.”
“Oh—they’re letting you in on those? Good news! I should call your boss to thank him for adding you.”
“Sorry—he’s not in the office right now...”
Counter-argument:
The truth can “spin out of control” just as easily as a lie, if people powerful enough to create the appearance of a lie choose to do so. It may sound absurd for a boss to go out of his way to make an employee appear to be lying to their spouse, but it does happen, and frighteningly often. Humans are masters of perception-manipulation for social gain; it has been part of our evolutionary landscape for at least O(a million years), and is theorized to be one of the reasons for our big brains. A sufficiently well-constructed lie will make every truth-speaker who disagrees with it sound like a liar. The assertion that the truth is more likely to spin out of control than any given lie, in any given situation, is amenable to evidence: is there some way we could categorize situations and then examine how often they spin out of control when the truth is told versus when a lie is told, so that more specifically accurate theories could be developed?
My own meager evidence suggests that the truth is more likely to spin out of control than a lie when, for example, it conflicts with a sufficiently well-prepared lie told by a social superior.