The Second Law of Thermodynamics causes the Kolmogorov complexity of the universe to increase over time. What you’ve actually constructed is an argument against being able to simulate the universe in full fidelity.
This is not right: K(·) is a function that applies to computable objects. It either does not apply to our Universe at all, or, if it does, it is a constant (and that constant would already “price in” the temporal evolution).
I sincerely don’t think it works that way. Consider the usual relationship between Shannon entropy and Kolmogorov complexity: H(X) ∝ E[K(X)]. We know that the Gibbs entropy, and thus the Shannon entropy, of the universe is nondecreasing, which means that the distribution over universe-states is becoming concentrated on ever more complex states over time. So the Kolmogorov complexity of the universe, viewed at a given instant in time but from a “god’s eye view”, is going up.
You could try to calculate the maximum possible entropy in the universe and “price that in” as a constant, but I think that dodges the point in the same way AIXI_{tl} does by using an astronomically large “constant factor”. You’re just plain missing information if you try to simulate the universe from its birth to its death from within the universe. At some point your simulation won’t be identical to the real universe anymore; it will diverge from reality because you’re not updating it with additional empirical data (or rather, because you never updated it with any empirical data in the first place).
Hmmm… is there an extension of Kolmogorov complexity defined to describe the information content of probabilistic Turing machines (which make random choices) rather than deterministic ones? I think that would better capture what we mean by “complexity of the universe”.
What does this mean? What is the expectation taken with respect to? I can construct an example where the above is false. Let x1 be the first n bits of Chaitin’s omega, and x2 be the (n+1)th through 2nth bits of Chaitin’s omega. Let X be a random variable which takes the value x1 with probability 0.5 and the value x2 with probability 0.5. Then E[K(X)] = 0.5·K(x1) + 0.5·K(x2) ≈ n, since the bits of omega are incompressible, but H(X) = 1 bit.
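(A toy numerical version of the same construction, in case it helps. K is uncomputable and the bits of omega certainly are not computable, so this sketch, which is purely illustrative, substitutes uniformly random strings for x1 and x2 and zlib-compressed length for K; the variable names and the length n are made up for the example.)

```python
import math
import os
import zlib

n = 4096  # length (in bytes) of each "incompressible" string

# Stand-ins for x1 and x2: random strings which, like omega's bits, do not compress.
x1 = os.urandom(n)
x2 = os.urandom(n)

def crude_K(s: bytes) -> int:
    """Compressed length in bits, a rough computable stand-in for K(s)."""
    return 8 * len(zlib.compress(s, 9))

# X takes the value x1 or x2, each with probability 0.5.
p = [0.5, 0.5]

# Shannon entropy of X depends only on the two probabilities, not on x1 and x2.
H_X = -sum(q * math.log2(q) for q in p)          # exactly 1 bit

# The "expected complexity" of X under the same distribution grows with n.
E_K_X = p[0] * crude_K(x1) + p[1] * crude_K(x2)  # roughly 8*n bits

print(f"H(X)    = {H_X:.1f} bits")
print(f"E[K(X)] = {E_K_X:.0f} bits")
```

The gap between the two numbers grows without bound as n grows, which is the point of the counterexample.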
edit: Oh, I see, this is a result on non-adversarial sample spaces, e.g. {0,1}^n, in Li and Vitanyi.

Yep. I should have gone and cited it, actually.
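(For what it’s worth, the version of the result I had in mind, as I recall it from Li and Vitanyi, is: for a computable probability distribution P over finite strings,

0 ≤ Σ_x P(x)·K(x) − H(P) ≤ K(P) + O(1),

i.e. expected Kolmogorov complexity and Shannon entropy agree up to an additive term that depends only on how hard P itself is to describe. The omega example above is consistent with this, because that K(P) term is enormous when P is built out of the bits of omega.)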
This is not and cannot be true. I mean, for one, the universe doesn’t have a Kolmogorov complexity*. But more importantly, a hypothesis is not penalized for having entropy increase over time, as long as the increases in entropy arise from deterministic, entropy-increasing interactions specified in advance. Just as atomic theory isn’t penalized for having lots of distinct objects, thermodynamics is not penalized for having seemingly random outputs which are secretly guided by underlying physical laws.
*If you do not see why this is true, consider that there can be multiple hypotheses which would output the same state in their resulting universes. An obvious example would be one which specifies our laws of physics and another which specifies the position of every atom, without compression in the form of physical law.
This is exactly the sort of thing for which Kolmogorov complexity exists: to specify the length of the shortest hypothesis which outputs the correct result.
Just as atomic theory isn’t penalized for having lots of distinct objects
Atomic theory isn’t “penalized”, because its many distinct objects are repeated instances of the same few kinds; it actually has very few things that don’t repeat. Atomic theory, after all, deals with masses of atoms.
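(A tiny, purely illustrative sketch of both points: two “hypotheses” that output the same state, one a short generating rule and one a verbatim listing, plus a demonstration that repetition is exactly what description length forgives. The rule, the names, and the use of zlib as a stand-in for ideal compression are all inventions for this example.)

```python
import os
import zlib

N = 100_000  # number of "atoms" in the toy universe-state

# Hypothesis A: a "physical law", a short rule that generates the entire state.
law = "state = bytes((3 * i + 7) % 251 for i in range(100000))"

# Hypothesis B: no law at all, the very same state written out position by position.
state = bytes((3 * i + 7) % 251 for i in range(N))
verbatim = "state = " + repr(state)

# Both hypotheses output exactly the same state; K only cares about the shortest one.
print(len(law), len(verbatim))  # a few dozen bytes vs. several hundred thousand

# The atomic-theory point: many repeated objects are cheap to describe,
# while a genuinely patternless state is not.
repeated = bytes([42]) * N          # one kind of "atom", repeated N times
patternless = os.urandom(N)         # no repetition, no law
print(len(zlib.compress(repeated, 9)))     # a tiny fraction of N
print(len(zlib.compress(patternless, 9)))  # roughly N, it does not compress
```

The verbatim hypothesis is a perfectly valid description of the same state; it just isn’t the shortest one, and the length of the shortest one is all K(·) measures.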
The Second Law of Thermodynamics causes the Kolmogorov complexity of the universe to increase over time. What you’ve actually constructed is an argument against being able to simulate the universe in full fidelity.
Um, you appear to be trying to argue that the universe has infinite Kolmogorov complexity. Well, if it does, it kind of undermines the whole “we must reject God because a godless universe has lower Kolmogorov complexity” argument.
Um, you appear to be trying to argue that the universe has infinite Kolmogorov complexity.
Not infinite, just growing over time. This just means that it’s impossible to simulate the universe with full fidelity from inside the universe, as you would need a bigger universe to do it in.