Maybe there is some true randomness in the universe.
Not a problem.
I know it’s not a problem. I explained exactly how to modify Solomonoff induction to handle universes that are generated randomly according to some computable law, as opposed to being generated deterministically according to an algorithm.
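Concretely, here’s a toy sketch of the modification: a Bayesian mixture over stochastic computable laws, each weighted by 2^−(description length). The real construction mixes over all lower-semicomputable semimeasures and is uncomputable; the four hand-picked models and their made-up description lengths below are illustrative stand-ins, not anyone’s actual prior.

```python
# Toy sketch: Solomonoff induction generalized to stochastic laws, i.e. a
# Bayesian mixture over computable probability models, each weighted by
# 2^-(description length). The real construction sums over all
# lower-semicomputable semimeasures and is uncomputable; these models and
# their description lengths are made-up stand-ins for illustration.
from fractions import Fraction

def bernoulli(p):
    """Stochastic law: each bit is independently 1 with probability p."""
    def prob(bits):
        q = Fraction(1)
        for b in bits:
            q *= p if b == 1 else 1 - p
        return q
    return prob

def alternator(start):
    """Deterministic law (a degenerate distribution): 0101... or 1010..."""
    def prob(bits):
        return Fraction(int(all(b == (start + i) % 2 for i, b in enumerate(bits))))
    return prob

# (model, hypothetical description length in bits)
HYPOTHESES = [
    (bernoulli(Fraction(1, 2)), 3),   # "fair coin"
    (bernoulli(Fraction(9, 10)), 5),  # "coin biased toward 1"
    (alternator(0), 4),               # "print 0101..."
    (alternator(1), 4),               # "print 1010..."
]

def predict_next(bits):
    """Posterior-weighted probability that the next bit is a 1."""
    num = den = Fraction(0)
    for model, length in HYPOTHESES:
        prior = Fraction(1, 2 ** length)
        num += prior * model(bits + [1])  # prior * P(bits followed by a 1)
        den += prior * model(bits)        # prior * P(bits)
    return num / den

print(float(predict_next([1, 0, 1, 0, 1, 0])))  # alternator dominates: ~0.99
print(float(predict_next([1, 1, 0, 1, 1, 1])))  # coin models take over: ~0.7
```

The point is that a deterministic law is just the degenerate case of a stochastic one, so genuine randomness in the universe costs the framework nothing.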
Suppose you flip a quantum coin ten times. If you record the output, the K-complexity is ten bits.
Maybe it is, maybe it isn’t. Maybe your definition of Kolmogorov complexity is such that the Kolmogorov complexity of every string is at least 3^^^3, because you defined it using a really dumb universal machine. Maybe it’s 2, because you built your universal machine to be really good at compressing the string 0101110010, which you happen to like, and that’s exactly what you flipped. If you flip a coin a very large number of times, it’s very likely that the Kolmogorov complexity of the output is at least about the number of flips, but it could be much smaller due to chance.
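For concreteness, the standard counting argument behind that last sentence: there are at most $2^{n-c}-1$ programs shorter than $n-c$ bits, so at most that many strings can have complexity below $n-c$. Over the $2^n$ equally likely outcomes of $n$ fair flips,

$$\Pr\big[K(X_{1..n}) < n - c\big] < \frac{2^{n-c}}{2^n} = 2^{-c},$$

so the output has complexity at least $n-c$ with probability at least $1-2^{-c}$. The machine-dependent constant $c$ is exactly why the claim only holds up to an additive fudge.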
As such, there’s a 1/1024 prior probability of getting that exact output. This is exactly what you’d get if you assumed it was random.
False, because you don’t actually know the complexity was 10.
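To spell that out: under Solomonoff’s universal prior, the probability assigned to an observed string $x$ is not $2^{-K(x)}$ but the total weight of every program that reproduces it,

$$M(x) \;=\; \sum_{p \,:\, U(p) \text{ outputs } x\ldots} 2^{-|p|},$$

where $U$ is a universal monotone machine and the sum runs over programs whose output begins with $x$. This comes out to $2^{-10}$ for a ten-flip record only under idealized assumptions about $U$; change the machine and both $K$ and $M$ shift by machine-dependent constants.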
Basically, K-complexity treats the very laws of physics as random. Any randomness on top of the laws works the same way it would if it were folded into the laws themselves.
No, it’s not doing that at all. Using a complexity-weighted prior is assuming that the universe follows some randomly chosen computable law, with simple laws more likely than complex ones. I suppose this is true to the extent that any probability distribution on a countable set vanishes in the limit (for any enumeration of the countable set!), but I see no reason to bring Kolmogorov complexity into it.
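The parenthetical is elementary: if $h_1, h_2, \ldots$ enumerates the hypotheses in any order, then

$$\sum_{n=1}^{\infty} p(h_n) \le 1 \quad\Longrightarrow\quad p(h_n) \to 0 \text{ as } n \to \infty,$$

so any prior on a countable hypothesis class concentrates all but $\varepsilon$ of its mass on finitely many hypotheses. A Kolmogorov prior is just one particular way of choosing which hypotheses get the bulk of the mass.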
The problem is that things that are definable but not computable seem like they should get more than chance-level probability. For example, the K-complexity of a halting oracle is infinite, but it can be defined in finite space. Would the probability of the fine structure constant encoding a halting oracle be infinitesimal?
I have no idea how to make sense of this question. Are infinitely many bits of the fine structure constant even physically meaningful? If beyond a certain point of precision, the fine structure constant can’t be measured even in principle because it has no detectable physical effects on accessible time/space scales, it makes no sense to even have an answer to the question “does the fine structure constant encode a halting oracle?”, let alone the probability of such an event.
One reason to use K-complexity is that so far, it’s worked far better than anything else. As far as we know, we can fit the laws of physics on a note card, yet the universe contains well over 10^80 particles, and don’t get me started on the amount of computing power necessary to run it.
I’ll grant you that Occam’s razor works well in science. That wasn’t the question. The question is what advantage, if any, there is in starting with a Kolmogorov prior for Solomonoff induction, as opposed to any other arbitrary prior. You haven’t answered my question.