Any time you attempt to implement AIXI (or any approximation) in the real world, you must specify the reward mechanism. If AIXI is equipped with a robotic body, you could designate certain sensor readings as “pain” signals. There is no need for a nebulous definition of what is or is not part of AIXI’s body to achieve this.
Ah, that makes sense! AIXI can receive pain signals long before it knows what they “mean”, and as its model of the world improves, it learns to avoid pain.
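To make the point concrete, here is a minimal sketch (all names are hypothetical, and the agent is stubbed out with a random policy) of how a designer might wire “pain” sensors into the reward channel of an agent loop. The agent only ever receives percepts and scalar rewards; no definition of its body is needed anywhere.

```python
import random
from dataclasses import dataclass

@dataclass
class Percept:
    observation: dict   # raw sensor readings, e.g. {"bumper_pressure": 1.0, ...}
    reward: float       # scalar reward computed from designer-designated sensors

def reward_from_sensors(readings: dict) -> float:
    """Designer-chosen reward: designated 'pain' sensors yield negative reward."""
    pain = readings.get("bumper_pressure", 0.0) + readings.get("motor_overload", 0.0)
    return -pain

class RandomAgent:
    """Stand-in for an AIXI approximation; it sees only percepts, never a body model."""
    def act(self, history):
        return random.choice(["forward", "back", "left", "right"])

def simulate_step(action: str) -> dict:
    """Toy environment: moving forward sometimes triggers the bumper sensor."""
    bump = 1.0 if action == "forward" and random.random() < 0.3 else 0.0
    return {"bumper_pressure": bump, "motor_overload": 0.0}

if __name__ == "__main__":
    agent, history = RandomAgent(), []
    for t in range(5):
        action = agent.act(history)
        readings = simulate_step(action)
        percept = Percept(readings, reward_from_sensors(readings))
        history.append((action, percept))
        print(t, action, percept.reward)
```

The agent initially has no idea what a negative reward “means”; as its world model improves, it simply learns which actions tend to precede those signals and avoids them.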