The issue of defining “literal honesty” seems pretty subtle. Allowing reasonable stretching of “literal” to “my concept of what people meant by the question” is … well, reasonable, but also not very consistent with drawing a clear line separating ‘honest’ from ‘dishonest’. Another issue is that if the definition of honesty is
“Don’t say things that you believe to be literally false in a context where people will (with reasonably high probability) persistently believe that you believe them to be true.”
then it seems to admit lying when everyone knows you are lying. I.e., someone whom everyone assumes to be a liar is “literally honest” no matter what they say! I take this to suggest that our definition has to include intent, not just expectation. But how to modify the definition to avoid further trouble is unclear to me.
Given the difficulties, it seems like one who wishes to adhere to ‘literal honesty’ had better err on the side of literalness, clarifying any issues of interpretation as they arise. Being very literal in your answers to “how are you” may be awkward in an individual case, but as a pattern, it sets up expectations about the sort of replies you give to questions.
On the object level, I disagree about the usual meaning of “how are you?”—it seems to me like it is more often used as a bid to start a conversation, and the expected response is to come up with small talk about your day / what you’ve been up to / etc.
I think that
“Don’t say things that you believe to be literally false in a context where people will (with reasonably high probability) persistently believe that you believe them to be true”
is actually in line with the “Bayesian honesty” component/formulation of the proposal. If one is known to universally lie, one’s words have no information content, and therefore don’t increase other people’s Bayesian probabilities of false statements. However, it seems this is not a behaviour that Eliezer finds morally satisfactory. (I agree with Rob Bensinger that this formulation is more practical in daily life.)
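To make the “no information content” point concrete, here is a quick Bayes-rule sketch (my own illustration; the symbols T and S are not from the original comments). Write T for “the asserted statement is true” and S for “the speaker asserts it”. The listener’s update on hearing the assertion is

$$P(T \mid S) = \frac{P(S \mid T)\,P(T)}{P(S \mid T)\,P(T) + P(S \mid \neg T)\,P(\neg T)}$$

If the listener models the speaker as asserting things independently of their truth, then $P(S \mid T) = P(S \mid \neg T)$, the likelihood ratio is 1, and $P(T \mid S) = P(T)$: the assertion shifts nothing. If the listener models the speaker as a systematic liar, then $P(S \mid T) < P(S \mid \neg T)$ and $P(T \mid S) < P(T)$: the assertion actually lowers credence in the claim. Either way, a listener with an accurate model of the speaker does not come to persistently believe the falsehood, which is why such a speaker technically satisfies the quoted definition.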