It eventually learns that the simplest explanation for its experiences is a description of an external lawful universe in which its sense organs are embedded, together with a description of that embedding.
That’s the simplest explanation for our experiences. It may or may not be the simplest explanation for the experiences of an arbitrary sentient thinker.
Rather than supposing that the probability of a certain universe depends on the complexity of that universe, it takes as a primitive object a probability distribution over possible experiences.
By the same reasoning that led a normal Solomonoff inductor to accept the existence of an external universe as the best explanation for its experiences, the least complex description of your conscious experience is a description of an external lawful universe together with directions for finding the substructure embodying your experience within that universe.
Unless I’m misunderstanding you, you’re saying that we should start with an arbitrary prior (which may or may not be the same as Solomonoff’s universal prior). If you’re starting with an arbitrary prior, you have no idea what the best explanation for your experiences is going to be, because it depends on the prior. According to some prior, it’s a giant lookup table. According to some prior, you’re being emulated by a supercomputer in a universe whose physics is being emulated at the elementary-particle level by hand calculations performed by an immortal sentient being (with an odd utility function) who lives in an external lawful universe.
Of course, the same will be true if you take the standard universal prior but define Kolmogorov complexity relative to a sufficiently bizarre universal Turing machine (of which there are many). According to the theory, this doesn’t matter, because over time you will predict your experiences with greater and greater accuracy. But you never update the relative credences you assign to models that make identical predictions. So if you started off thinking that the simulation of the simulation of the simulation was a better model than simply discarding the outer layers and taking the innermost level, you will forever hold the unfalsifiable belief that you live in an inescapable Matrix, even as you use your knowledge to model reality correctly and use your model to maximize your personal utility function (or whatever it is Solomonoff inductors are supposed to do).
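To make that last point explicit (a minimal sketch using the standard definitions, not anything specific to the proposal under discussion): the invariance theorem only bounds the effect of the choice of universal Turing machine by an additive constant, which can be arbitrarily large for a sufficiently bizarre machine, and Bayesian updating preserves the odds between any two models that assign the same likelihood to every observation:

\[
K_U(x) \le K_V(x) + c_{U,V}, \qquad c_{U,V} \text{ depending on } U, V \text{ but not on } x,
\]
\[
\frac{P(M_1 \mid D)}{P(M_2 \mid D)} \;=\; \frac{P(D \mid M_1)\,P(M_1)}{P(D \mid M_2)\,P(M_2)} \;=\; \frac{P(M_1)}{P(M_2)} \quad \text{whenever } P(D \mid M_1) = P(D \mid M_2).
\]

The second identity is why the nested-simulation model and the innermost-level model keep whatever prior odds they started with, no matter how much data comes in.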