It seems that you could design such games by taking any of the games we are discussing and modifying it so the agent is allowed to ask an oracle questions and get answers before making its decision. Predicting the results of such games seems harder.
Actually, I think a human can easily win such games. Consider game 1 with Chaitin’s Ω as the string to be predicted. A human with access to the halting oracle can compute any prefix of Ω exactly, whereas the Solomonoff prior treats Ω like a random string and would converge to answering .5 for the probability of each next bit being 1.
But this isn’t completely convincing (for the case that Solomonoff isn’t sufficient) because perhaps an AI programmed with the Solomonoff prior can do better (or the human worse) if the halting oracle is a natural part of the environment, instead of something they are given special access to.
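To make the first step concrete, here is a minimal Python sketch of the usual argument that the bits of Ω are computable relative to a halting oracle: Ω is enumerable from below, so “is Ω ≥ q?” is a halting question, and binary search recovers the bits one by one. The functions `halting_oracle` and `make_threshold_checker` are hypothetical placeholders, not part of any actual game interface.

```python
from fractions import Fraction

def halting_oracle(program_source: str) -> bool:
    """Assumed oracle: True iff the given program halts. Not computable."""
    raise NotImplementedError("this is the oracle the game grants access to")

def make_threshold_checker(threshold: Fraction) -> str:
    """Return (the source of) a program that enumerates the computable lower
    approximations Omega_0 <= Omega_1 <= ... -> Omega and halts as soon as
    some Omega_s >= threshold.  Such a program exists because Omega is
    enumerable from below; the string here is only a stand-in."""
    return f"run the dovetailer until some Omega_s >= {threshold}"

def omega_prefix(n: int) -> list[int]:
    """First n bits of Omega, one oracle query per bit.
    Since Omega is irrational, 'Omega >= q' and 'some Omega_s >= q'
    coincide for dyadic rationals q, so each query is a halting question."""
    bits: list[int] = []
    prefix = Fraction(0)
    for i in range(1, n + 1):
        candidate = prefix + Fraction(1, 2 ** i)  # value if the i-th bit is 1
        bit = 1 if halting_oracle(make_threshold_checker(candidate)) else 0
        bits.append(bit)
        if bit:
            prefix = candidate
    return bits
```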
Isn’t something like “If you have access to a halting oracle, ask it for Chaitin’s Ω” among the computable generators included in the Solomonoff prior? If so, the Solomonoff agent should converge to the correct sequence.
No, at least not in the usual formulation of the Solomonoff prior that I’m familiar with.
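For reference, one common formulation defines the prior as a mixture over ordinary, oracle-free prefix programs:

```latex
M(x) \;=\; \sum_{p \,:\, U(p) \text{ begins with } x} 2^{-|p|}
```

Here $U$ is a plain universal prefix Turing machine, so a hypothesis of the form “ask the halting oracle for Ω” is not among the programs $p$ in the sum.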