Maybe Omega recognizes in advance that you might think this way, doesn’t want it to happen, and so precommits to asking the real you. With the existence of this precommitment, you may not properly make this reasoning. Moreover, you should be able to figure out that Omega would precommit, thus making it unnecessary for him to explicitly tell you he’s doing so.
Maybe Omega [...] doesn’t want it to happen [...] Moreover, you should be able to figure out that Omega would precommit
(Emphasis mine.)
I don’t think, given the usual problem formulation, that one can figure out what Omega wants without Omega explicitly saying it, and maybe not even in that case.
It’s a bit like a deal with a not-necessarily-evil devil. Even if it tells you something, and you’re sure it’s not lying, and you think the wording is perfectly clear, you should still assign a very high probability that you have no idea what’s really going on and why.