That’s an interesting case because, if the Nazi is well-informed about my goals, he will probably be aware that I’d lie to him about anything short of the end of the world, and he could easily suspect that I’m falsely reporting this risk to get him not to blow up people I’d prefer to leave intact. If all he knows about my goals is that I don’t want the world to end, then whether he heeds my warning depends on his uninformed guess about the rest of my beliefs, which could fall either way.
That’s why I think that if, say, a scientist were tempted by the Noble Lie “this bomb would actually destroy the whole earth; we cannot work on it any further,” that would be a terrible decision. By the same logic that says I hand Omega $100 so that counterfactual me gets $10000, I should not lie about such a risk, so that counterfactual me can be believed in the worlds where the risk actually exists.
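For concreteness, here’s a minimal sketch of the expected-value arithmetic behind that Omega trade (the standard counterfactual mugging setup: a fair coin, and a predictor who pays out on heads only if you would have paid on tails). The $100 and $10000 figures are from the comment above; the function name, parameters, and simulation framing are all just illustrative assumptions.

```python
import random

def average_payoff(policy_pays: bool, trials: int = 100_000) -> float:
    """Average payoff for an agent whose policy Omega predicts perfectly.

    policy_pays: whether the agent's fixed policy is to hand over $100
    when the coin lands tails. (Setup assumed: fair coin; on heads,
    Omega pays $10000 iff it predicts the agent would pay on tails.)
    """
    total = 0
    for _ in range(trials):
        if random.random() < 0.5:
            # Heads: Omega pays out only to an agent it predicts would pay.
            total += 10_000 if policy_pays else 0
        else:
            # Tails: Omega asks for $100; a committed payer hands it over.
            total -= 100 if policy_pays else 0
    return total / trials

print(average_payoff(True))   # ~4950: the committed payer
print(average_payoff(False))  # ~0: the refuser
```

Ex ante, the committed payer averages about $4950 per encounter and the refuser averages $0, which is why the policy of paying in the bad branch wins even though paying looks pointless once you’re in it; the same schema applies to truth-telling about existential risks.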