“To tell the truth is a duty, but is a duty only with regard to the man who has a right to the truth.”
Kant disagrees and seems to warn that the principle of truth-telling is universal; you can’t go around deciding who has a right to the truth and who does not. Furthermore, he suggests that your lie could have terrible unforeseen consequences.
Lie to the Nazis who you feel “don’t deserve the truth,” and they may end up treating everyone on the rest of the block like liars, sending all sorts of people to the concentration camps or killing them outright because it’s not worth trying to ferret out the truth, and so on.
Eliezer:
When I was reading through your other article, I thought the “fate of the world” part suggested that not lying should be the basis for a universalizable duty like Kant’s. The existence of a future “fate of the world” event makes it seem like you are getting at the same unforeseen-consequences point as The Big K. Is this accurate?
I am concerned that deciding on who is rational enough to treat honestly is a slippery slope.
Personally, this seems like a point where you take a metaphysical stand, rationally work your way through the options within your axiomatic system, and then apply your rational program to the choice at hand. I am more utilitarian than Kant, but it is not hard to ignore “proximity” and come up with a cost/benefit calculation that agrees with him.