It’s actually worse than that. If you define entropy using information theory, there is no such thing as random fluctuations decreasing entropy: the system merely moves from one maximal-entropy state to the next, indefinitely.
If you work under the hypothesis that information is preserved, then the total entropy of the universe neither increases nor decreases.
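A minimal sketch of that last point, under the assumption that information-preserving dynamics can be modeled as a bijection (here, a permutation) on a finite set of microstates: relabeling states leaves the Shannon entropy of the distribution unchanged.

```python
from math import log2
from random import random, seed, shuffle

def shannon_entropy(p):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(x * log2(x) for x in p if x > 0)

seed(0)
# An arbitrary normalized distribution over 8 microstates.
weights = [random() for _ in range(8)]
total = sum(weights)
p = [w / total for w in weights]

# Information-preserving dynamics modeled as a bijection:
# a permutation merely relabels the microstates.
perm = list(range(8))
shuffle(perm)
p_evolved = [p[i] for i in perm]

# The entropy is invariant under the permutation.
print(abs(shannon_entropy(p) - shannon_entropy(p_evolved)) < 1e-12)  # True
```

The same argument is what underlies the quantum version: unitary evolution preserves the von Neumann entropy for exactly the same reason a permutation preserves Shannon entropy.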