If you find yourself needing to lie for your cause, what you’re effectively admitting is that the truth doesn’t support it.
Not necessarily. You may be dealing with irrational people who will not be moved by truth. Or the inferential distances may be long, and you have only a very short time to convince people before something irreversible happens—although in that case, you are creating problems in the long run.

(I generally agree with what you said. This is just an example of how this generalization is also leaky. And of course, because we run on corrupted hardware, every situation will likely seem to be the one where the generalization does not apply.)