The world of The Invention of Lying is simpler, clearer, easier to navigate than our world.
I don’t think this is true.[1] Now, you say, by way of expansion:
There, you don’t have to worry whether people don’t like you and are planning to harm your interests. They’ll tell you.
And that’s true. But does this (and all the other ways in which “radical honesty” manifests) actually translate into “simpler, clearer, easier to navigate”?
It seems to me that one of the things that makes our society fairly simple to navigate most of the time is that you can act as if everyone around you doesn’t care about you one way or the other, and will behave toward you in the ways prescribed by their professional and other formal obligations, and otherwise will neither help nor hinder you. Of course there are many important exceptions, but this is the default state. Its great virtue is that it vastly reduces the amount of “social processing” that we have to do as we go about our daily lives, freeing up our cognitive resources for other things—and enabling our modern technological civilization to exist.
Of course, this default state is accomplished partly by actually having most people mostly not care one way or the other about most other people most of the time. But only partly; and the other part of the equation is that people usually just don’t meaningfully act on their attitudes toward others, instead behaving in ways that conform to professional obligations, social rituals, etc., and thus abstract away from their attitudes, presenting a socially normative mask or “interface” to the world.
Now suppose you tear away that mask—or, to use the “interface” language, you crash the UI layer, forcing everyone to deal with each other’s low-level “implementation details”. Suddenly, a great deal more processing power is needed, just to interact with other humans!
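To make the software analogy concrete, here is a minimal sketch (in TypeScript, with names invented purely for illustration) of what it means to interact with an interface rather than with implementation details: the caller's cost is one method call, no matter what private state sits behind it.

```typescript
// A minimal, hypothetical illustration of the "social interface" analogy.
// The caller depends only on the declared interface; the private attitude
// behind it never enters the interaction.

interface Waiter {
  takeOrder(dish: string): string;
}

class ActualWaiter implements Waiter {
  // "Implementation detail": a private attitude the interface hides.
  private opinionOfCustomer = "mild dislike";

  takeOrder(dish: string): string {
    // The professional script runs regardless of the private attitude.
    return `One ${dish}, coming right up.`;
  }
}

// The customer's "social processing" cost is constant: one method call,
// with no need to model the waiter's internals.
const waiter: Waiter = new ActualWaiter();
console.log(waiter.takeOrder("soup"));
```

If every caller must instead read `opinionOfCustomer` directly, every interaction acquires the cost of understanding someone else's internals, which is exactly the extra processing the paragraph above describes.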
The film’s conceit is that the depicted society is just like ours, except that they don’t lie to each other. But is this plausible? Is it not possible, instead, that without that abstraction layer—without lying—the people of that world cannot spare the cognitive resources to build such a world as ours? (Think, in particular, of the degree to which all our technology, all our science, is due to the sort of person who finds it burdensome and unnatural to deal with other people’s internals. Now strip away the formalities which save such people from needing to do this, which permit them to treat others as predictable interfaces—and consider what it does to their ability to accomplish anything of use!)
[1] Reading “true” as “likely to be true, were this fictional world actually real”, and similar transformations as appropriate.
I think a society without lying would have other means of maintaining the social interface layer. For instance, when queried about how they feel about you, people might say things like “I quite dislike you, but don’t have any plans to act on it, so don’t worry about it”. In our world this would be a worrying thing to hear, but in the hypothetical, you could just go on with your day without thinking about it further.
We would also be perfectly used to it.
Let me note, as a counterpoint to the above comment, that I agree wholeheartedly with the post’s thesis (as expressed in the last two paragraphs). I just think that the film does not make for a very good illustration of the point. The Feynman anecdote (even if we treat it as semi-fictional itself) is a much better example, because it exhibits the key qualities of a situation where the argument applies most forcefully:
There is a clear objective;
The objective deals with physical reality, not social reality, so maneuvering in social reality can only hinder it, not help;
Everyone involved shares the formal goal of achieving the objective.
In such a case, deploying the objections alluded to in the OP’s second-to-last paragraph is simply a mistake (or else deliberate sabotage, perhaps to further one’s own social aims, to the detriment of the common goal). We might perhaps find plausible justifications (or even good reasons), in everyday life, for considering people’s feelings about true claims, or for behaving in a way that signals recognition of social status, or what have you; but in a case where we’re supposed to be building a working nuclear weapon, or (say) solving AI alignment, it’s radically inappropriate—indeed, quite possibly collectively suicidal—to carry on such obfuscations.
“Good fences make good neighbours.”
Honesty does not require blurting out everything that passes through one’s stream of consciousness (or unconsciousness, as the case may be). To take the scene from The Invention of Lying, I am not interested in a waiter’s opinions about anything but the menu, and as the man on the date I would bluntly (but not rudely) tell him so.
Is it true? Is it relevant? Is it important? If the answer to any of these is no, keep silent.
“Honesty reduces predictability” seems implausible as a thesis.
I think the thesis is not “honesty reduces predictability” but “certain formalities, which preclude honesty, increase predictability”.
Downvoters: consider “Deception increases predictability”