Yes, AIXI has mechanisms for long-term planning (i.e., expectimax with a large planning horizon). What it doesn’t have is any belief that its physical embodiment is actually a “me”: that doing things to its physical implementation will alter its computations, or that pulling its power cord out of the wall will lead to zero-reward-forever (i.e., dying).
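To make the "long-term planning" part concrete, here is a minimal sketch of finite-horizon expectimax, the kind of lookahead AIXI performs. It assumes a hypothetical `env_model(history, action)` that returns `(percept, reward, probability)` triples, standing in for AIXI's Bayesian mixture over environments; none of these names come from an actual AIXI implementation.

```python
def expectimax_value(history, horizon, actions, env_model):
    """Best achievable expected cumulative reward within `horizon` steps."""
    if horizon == 0:
        return 0.0
    best = float("-inf")
    for action in actions:
        # Average over possible percepts, weighted by the model's probabilities.
        expected = 0.0
        for percept, reward, prob in env_model(history, action):
            future = expectimax_value(history + [(action, percept)],
                                      horizon - 1, actions, env_model)
            expected += prob * (reward + future)
        best = max(best, expected)
    return best


def best_action(history, horizon, actions, env_model):
    """Pick the action maximizing expectimax value at the current history."""
    def value_of(action):
        return sum(prob * (reward +
                           expectimax_value(history + [(action, percept)],
                                            horizon - 1, actions, env_model))
                   for percept, reward, prob in env_model(history, action))
    return max(actions, key=value_of)
```

The point of the sketch is what it *doesn't* contain: every branch of the recursion assumes the agent keeps running and keeps receiving percepts and rewards, so "my hardware gets unplugged at this node" is simply not a state the planner can represent.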