You missed a crucial point of the post: when the AI runs a simulation to evaluate the consequences of an action it would not normally take, observing that very action is itself a clue to SimDave that he is being simulated. Here’s the relevant part from the OP:
> So Dave has just asked PAL to get him a cup of coffee. Dave is used to seeing PAL take route A to the coffee machine, and is initially puzzled because PAL is driving along route B. But then Dave has an epiphany. Dave knows with very high certainty that no PAL computer has ever made a mistake, so he can conclude with equally high certainty that he is no longer Dave. He is [Dave], a simulated version of Dave created inside PAL while it is computing the utility of taking route B.