Omega isn’t assigned the status of Liar until it actually does something.
Simulating somebody is doing something, especially from the point of view of the simulated. (Note that in Cyan’s thought experiment she has a consciousness and all.)
We postulated that Omega never lies. The simulated consciousness hears a lie. Now, as far as I can see, you have two major ways out of the contradiction. The first is that it is not Omega that does this lying, but simulated-Omega. The second is that lying to a simulated consciousness does not count as lying, at least not in the real world.
The first is perfectly viable, but it highlights what was, for me, the main take-home message from Cyan’s thought experiment: that “Omega never lies” is harder to formalize than it appears.
The second is also perfectly viable, but it will be extremely unpopular here at LW.
Perhaps I am not fully understanding what you mean by simulation. If I create a simulation, what does this mean?
In this context, something along the lines of whole brain emulation.