I think one way to avoid having to call this regret of rationality would be to see optimism as deceiving, not yourself, but your immune system. The fact that the human body acts differently depending on the person’s beliefs is a problem with human biology, which should be fixed. If Omega does the same thing to an AI, Omega is harming that AI, and the AI should try to make Omega stop it.
Well, deceiving something else by means of deceiving yourself still involves doublethink. It’s the same as saying humans should not try to be rational.
It’s saying that it may be worth sacrificing accuracy (after first knowing the truth so you know whether to deceive yourself!) in order to deceive another agent: your immune system. It’s still important to be rational in order to decide when to be irrational: all the truth still has to pass through your mind at some point in order to behave optimally.
On another note, you may benefit from reciting the Litany of Tarski:
If lying to myself can sometimes be useful, I want to believe that lying to myself can sometimes be useful.
If lying to myself cannot be useful, I want to believe that lying to myself cannot be useful.
Let me not become attached to beliefs I may not want.
I know my brain is a massively parallel neural network with only smooth fitness curves, and certainly isn’t running an outdated version of Microsoft Windows, but from how it’s behaving in response to this, you couldn’t tell. I’m a sucky rationalist. :(