That just means the AI cares about a particular class of decision theories rather than a specific one, as Dick Kick'em does. I could re-run the same thought experiment, but instead Dick Kick'em says:
"I am going to read your mind, and if you believe in a decision theory that one-boxes in Newcomb's Paradox I will leave you alone, but if you believe in any other decision theory I will kick you in the dick."
In this variation, Dick Kick'em would be judging the agent by the exact same criteria that the AI in Newcomb's problem uses. All I have done is remove the game afterwards, but that is somewhat irrelevant, since the AI doesn't judge you on your actions, just on what you would do if you were in a Newcomb-type scenario.
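The parallel can be made concrete with a short sketch (the function names are my own; the payoffs are the standard Newcomb amounts). The point is that both judges condition only on the agent's decision theory, never on an action actually taken:

```python
def newcomb_payoff(one_boxer: bool) -> int:
    """Classic Newcomb payoffs with a perfect predictor.

    The predictor puts $1,000,000 in the opaque box only if it
    predicts the agent will one-box; the transparent box always
    holds $1,000. A one-boxer takes only the opaque box; a
    two-boxer takes both.
    """
    opaque = 1_000_000 if one_boxer else 0
    transparent = 1_000
    return opaque if one_boxer else opaque + transparent


def dick_kickem_outcome(one_boxer: bool) -> str:
    """Dick Kick'em's variant: same criterion, but no game afterwards."""
    return "left alone" if one_boxer else "kicked in the dick"


# Both outcomes are fixed the moment the mind is read:
print(newcomb_payoff(True), newcomb_payoff(False))        # 1000000 1000
print(dick_kickem_outcome(True), dick_kickem_outcome(False))
```

In both functions the only input is what the agent *believes*, which is exactly why removing the game afterwards changes nothing about what is being rewarded.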
Having true beliefs doesn't mean omniscience. It is possible to have only true beliefs and still not know everything. In this case, the agent might not know whether the driver can read minds, while still having accurate beliefs otherwise.