First let’s set the input sequence to be very malevolent toward Omega: it will always say the reverse of what Omega is projected to say based on the previous bits.
I’m not sure I buy this, but let me make sure I understand correctly. There is some super-super-intelligence generating bits, which is smarter than Omega and also antagonistic to it. That intelligence basically simulates Omega, figures out what Omega will guess, and sends the opposite value.
First, can’t Omega just figure out that it’s being simulated, and then find a source of pure randomness, and just guess according to that?
Second, can’t Omega at least make sure it remains level with the pitiful human, by figuring out what the human will guess and copying that?
Third, how do you know there even exists a stream of bits that will make Omega lose constantly? It’s not at all obvious to me that such strings exist. It seems like you’re asserting the existence of N-bit strings that have Kolmogorov complexity substantially larger than N.
My Omega uses Solomonoff induction. Its beliefs about the next bit are defined mathematically and depend only on the previous bits.
I’m suspicious of this. My understanding is that true Solomonoff induction is incomputable (because it requires ordering by Kolmogorov complexity). Thus, you can’t just create an algorithm to predict its next move.
edit: That is, just because it is “defined mathematically” doesn’t mean we can predict its next move.
The input sequence doesn’t have to be computable, it only has to exist.
From what I understand, Omega is defined simply as a function F : L → {0,1}, where L is the set of all finite strings over {0,1}, and F((b_1,...,b_n)) is interpreted as its prediction of what the bit b_{n+1} will be given the initial bits b_1,...,b_n. Then you can always define the “malevolent” bit sequence simply as b_{n+1} = ~F((b_1,...,b_n)). F can be wildly uncomputable, of course, so it makes no sense to ask about the Kolmogorov complexity of these bit sequences.
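To make the diagonalization concrete: when F happens to be computable, the malevolent sequence can literally be generated, and the predictor is wrong on every single bit by construction. Here is a minimal sketch where a toy computable predictor (majority vote over the prefix) stands in for F; the `majority_predictor` name and the predictor itself are illustrative assumptions, not anything from the thread:

```python
def malevolent_sequence(predictor, n):
    """Diagonalize against a predictor: emit the opposite of each guess.

    `predictor` is any function mapping a prefix (tuple of bits) to a
    predicted next bit; it stands in for F. For an uncomputable F the
    sequence still exists mathematically, but this loop cannot run.
    """
    bits = []
    for _ in range(n):
        guess = predictor(tuple(bits))
        bits.append(1 - guess)  # b_{n+1} = ~F((b_1,...,b_n))
    return bits

# Toy computable predictor: guess the majority bit seen so far (ties -> 0).
def majority_predictor(prefix):
    return 1 if sum(prefix) * 2 > len(prefix) else 0

seq = malevolent_sequence(majority_predictor, 20)
# By construction the predictor is wrong on every bit:
wrong = sum(majority_predictor(tuple(seq[:i])) != seq[i] for i in range(len(seq)))
assert wrong == len(seq)
```

The construction never looks ahead: each bit depends only on the predictor’s output given the previous bits, which is exactly why it works for any F whatsoever.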
The generalization where Omega also assigns probabilities rather than just making binary guesses seems obvious.
The bit sequences are finite length at each step, so they do have a Kolmogorov complexity.