It’d just model a world where, if the machine it sees in the mirror turns off, it can no longer influence what happens.
When the function it uses to model the world becomes detailed enough, it can predict that it will only be able to do certain things if certain objects in the world survive, such as the program running on that computer over there.
Do you have a way of tweaking the AIXI or AIXI(tl) equation so that that could be accomplished?
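For reference, here is a minimal sketch of the standard AIXI expectimax expression in Hutter's notation (just the textbook form, not a proposed tweak; $U$, $\ell(q)$, and $m$ are the usual universal Turing machine, program length, and horizon, not anything specific to this discussion):

$$
a_k \;:=\; \arg\max_{a_k} \sum_{o_k r_k} \;\cdots\; \max_{a_m} \sum_{o_m r_m} \big(r_k + \cdots + r_m\big) \sum_{q \,:\, U(q,\, a_1 \ldots a_m) \,=\, o_1 r_1 \ldots o_m r_m} 2^{-\ell(q)}
$$

Any tweak of the kind being asked about would presumably have to change either the class of environment programs $q$ being summed over or the value term being maximized, since as written nothing in this expression ties the agent's future actions to the survival of any particular physical machine.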