The proof that I’ll let the AI out is not something that’s passively “plausible” or “implausible”; it’s something I control. I can make it wrong.
Do you say that to time-travelers and prophets too? ,:-.
One might want to perform the opposite of whatever action any correct formal proof presented to you claims you will take. By having the property of behaving this way, you ensure you are never confronted with confusing, formally correct claims about your own future decisions.
In other words, your actions are free even from the limitations of formally correct proofs, in the sense that because your actions oppose any such proof, the proofs become impossible (you make your actions unprovable by construction).
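To make the diagonal step concrete, here is a minimal Python sketch of that policy. This is my own illustration, not anything from the exchange: `proof_of_my_action` is a hypothetical stand-in for a proof search over the agent's own source code, and the action names are invented for the example.

```python
# Toy sketch of the "do the opposite of any proof about yourself" policy.
# `proof_of_my_action` is a hypothetical stub: it would return an action A
# if a formal proof exists that the agent outputs A, and None otherwise.

from typing import Optional

ACTIONS = ("release_AI", "keep_AI_boxed")  # hypothetical action labels

def proof_of_my_action(agent_source: str) -> Optional[str]:
    """Stand-in for a proof search over the agent's own source.
    In this sketch no proof is ever found."""
    return None

def agent(agent_source: str) -> str:
    claimed = proof_of_my_action(agent_source)
    if claimed is not None:
        # Diagonal step: do the opposite of whatever the proof claims.
        return ACTIONS[1] if claimed == ACTIONS[0] else ACTIONS[0]
    return ACTIONS[1]  # default behavior when no proof is presented

# If the proof system is sound, it can never prove "agent(...) == A":
# any such proof would be falsified by the diagonal step above, so an
# agent with this property is never confronted with one.
print(agent("<this agent's own source>"))
```

The point of the sketch is only the structure: conditioning your action on the negation of any proof about it makes a sound proof of that action impossible.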
Yes, in every case where I meet one.