Suppose a program p is playing chess (or go or checkers*) against someone or something, and models ‘it’ as a program, and tries to simulate what they will do in response to some moves p is considering.
*Solved under at least one rule set.
Then it would think it was in a universe much simpler than ours, and to convince it otherwise we would have to give it a number of bits on the order of the difference in (Kolmogorov) complexity between that simple universe and ours.
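The setup above can be sketched concretely. In this toy version (all names and the game itself are my illustrative assumptions, not anything specified in the post), the player program `p` holds an explicit program `opponent_model` standing in for "it", and before committing to a move it runs that model forward to see what reply each candidate move would provoke. The game is a Nim variant (take 1–3 sticks, taking the last stick wins), chosen only because it keeps the simulation a few lines long:

```python
# Sketch: program p models its opponent as a program and simulates
# the opponent's responses to candidate moves before choosing.
# The game (a Nim variant) and the greedy opponent policy are
# illustrative assumptions, not part of the original text.

def legal_moves(sticks):
    """Take 1, 2, or 3 sticks; whoever takes the last stick wins."""
    return [n for n in (1, 2, 3) if n <= sticks]

def opponent_model(sticks):
    """p's model of 'it' as a program: a fixed greedy policy
    that tries to leave a multiple of 4 (the winning strategy)."""
    for n in legal_moves(sticks):
        if (sticks - n) % 4 == 0:
            return n
    return 1

def simulate(sticks):
    """Play the game out from the opponent's turn, using the model
    for 'it' and p's own policy for p. Returns the winner."""
    sticks -= opponent_model(sticks)
    if sticks == 0:
        return "it"
    sticks -= choose_move(sticks)
    if sticks == 0:
        return "p"
    return simulate(sticks)

def choose_move(sticks):
    """p considers each move and simulates what 'it' will do in response."""
    for my_move in legal_moves(sticks):
        if my_move == sticks:
            return my_move  # taking the last stick wins outright
        if simulate(sticks - my_move) == "p":
            return my_move  # simulation says this line of play wins
    return legal_moves(sticks)[0]  # no winning move found; play anything
```

From 5 sticks, `choose_move(5)` simulates the modeled opponent's reply to each option and settles on taking 1, leaving 4, a losing position for the greedy model. Note that `p` plus its opponent model is a small, fully deterministic world, which is exactly the point made next: the simulated opponent "lives" in something far simpler than our universe.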
What? Pretty sure chess AIs aren’t that complex today. They handle a simple world.