I sincerely don’t think it works that way. Consider the usual relationship between Shannon entropy and Kolmogorov complexity: $H(X) \propto E[K(X)]$. We know that the Gibbs, and thus Shannon, entropy of the universe is nondecreasing, which means that the distribution over universe-states is concentrating more probability on more complex states over time. So the Kolmogorov complexity of the universe, viewed at a given instant in time but from a “god’s-eye view”, is going up.
You could try to calculate the maximum possible entropy of the universe and “price that in” as a constant, but I think that dodges the point in the same way AIXI_{tl} does with its astronomically large “constant factor”. You’re just plain missing information if you try to simulate the universe from its birth to its death from within the universe. At some point your simulation won’t be identical to the real universe anymore; it’ll diverge from reality because you’re not updating it with additional empirical data (or rather, because you never updated it with any empirical data in the first place).
Hmmm… is there an extension of Kolmogorov complexity that describes the information content of probabilistic Turing machines (which make random choices) rather than deterministic ones? I think that would better capture what we mean by “complexity of the universe”.
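(Hedging, since I’m recalling this from memory rather than checking a reference: the closest standard notion I know of is Levin’s universal a priori probability $m$, which is exactly the output distribution of a universal prefix machine $U$ fed fair coin flips, and the coding theorem ties it back to ordinary prefix complexity:)

$$m(x) \;=\; \sum_{p \,:\, U(p) = x} 2^{-|p|}, \qquad K(x) \;=\; -\log_2 m(x) + O(1).$$

If that’s right, then the randomness of the generating machine is already reflected in $K$, up to a constant, via $m$.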
What does this mean? What is the expectation taken with respect to? I can construct an example where the above is false. Let $x_1$ be the first $n$ bits of Chaitin’s $\Omega$, and $x_2$ the $(n+1)$-th through $2n$-th bits. Let $X$ be a random variable that takes the value $x_1$ with probability 0.5 and the value $x_2$ with probability 0.5. Then $E[K(X)] = \Theta(n)$, but $H(X) = 1$.
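Spelling out the arithmetic (a sketch; the key fact is that $\Omega$ is algorithmically random, so each $n$-bit block of it has prefix complexity close to $n$):

$$H(X) \;=\; -\tfrac12\log_2\tfrac12 \;-\; \tfrac12\log_2\tfrac12 \;=\; 1 \text{ bit}, \qquad E[K(X)] \;=\; \tfrac12 K(x_1) + \tfrac12 K(x_2) \;\ge\; n - O(\log n),$$

so the ratio $E[K(X)]/H(X)$ grows without bound in $n$, ruling out any fixed constant of proportionality.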
edit: Oh, I see, this is a result on non-adversarial sample spaces, e.g. $\{0,1\}^n$, in Li and Vitányi.
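For reference, and quoting from memory so the exact form may be off: for a computable distribution $P$ with finite entropy, expected prefix complexity matches Shannon entropy up to an additive term depending only on $P$,

$$0 \;\le\; \sum_x P(x)\,K(x) \;-\; H(P) \;\le\; K(P) + O(1).$$

The $\Omega$ example above evades this because its distribution isn’t computable: being able to evaluate $P$ would let you pick out the $\Omega$-blocks, i.e. compute bits of $\Omega$.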
Yep. I should have gone and cited it, actually.