We define the direction of time by the increase in Entropy.
“If we wait long enough” rhetorically hides the timescales involved. For merely 10 helium atoms to congregate in the central 1% of a sealed sphere, assuming it takes them 1/10 of a second to traverse that volume, would require about 10^99 seconds. That is 10^82 times the current age of the universe; equivalently, it will happen approximately 10 times before the heat death of the universe.
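For concreteness, the shape of that estimate can be sketched with a toy independence model. This is a sketch, not necessarily the calculation behind the quoted figures; the per-atom probability p is a free parameter, and p = 1e-10 is simply the value that reproduces the 10^99 s above:

```python
# Toy estimate of the waiting time for N atoms to occupy a small region
# simultaneously: assume independent, uniformly distributed atoms and one
# fresh "attempt" every tau seconds.

def waiting_time_seconds(p, n_atoms, tau=0.1):
    """Expected seconds until all n_atoms are in the region at once."""
    return tau / p**n_atoms

AGE_OF_UNIVERSE_S = 4.3e17  # ~13.8 billion years, in seconds

# p = 1e-10 reproduces the ~10^99 s quoted above; the exact value depends
# on how "the center 1% of the sphere" is defined.
wait = waiting_time_seconds(p=1e-10, n_atoms=10)
print(f"{wait:.1e} s, i.e. {wait / AGE_OF_UNIVERSE_S:.1e} times the age of the universe")
```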
If we want to “define the direction of time by the increase in Entropy”, then we have a problem: in a universe where entropy is not monotonic, the definition doesn’t work.
The “age of the universe” might not really be the age of the universe, but the time since the last entropy minimum reached in a never-ending sequence of fluctuations.
Let me put it another way: if Entropy were any sort of random walk, there would be examples of spontaneous Entropy decrease in small environments. Since we have 150 years of experimental observation across a variety of scales and domains, and literally zero examples of spontaneous Entropy decrease, that puts the hypothesis “Entropy behaves like a random walk” on par with “the sum of the square roots of any two sides of an isosceles triangle is equal to the square root of the remaining side”. It’s a test example for “how close to 0 probability can you actually get”.
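As a sketch of what the random-walk hypothesis would entail, assuming “random walk” means an unbiased walk sampled at discrete observation intervals:

```python
import random

# If macroscopic entropy behaved like an unbiased random walk observed at
# discrete intervals, roughly half of all observation intervals would show
# a decrease. A toy illustration of the point being made:
random.seed(0)
steps = [random.choice([-1, 1]) for _ in range(1_000_000)]
decreases = sum(s < 0 for s in steps)
print(f"fraction of intervals showing a decrease: {decreases / len(steps):.3f}")
# -> about 0.5, against 150 years of observations reporting none at macro scale
```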
Depending on how you define it, there are arguably observations of entropy decreases at small scales (if you are willing to define the “entropy” of a system made of two atoms, for example).
At the macroscopic scale (10^23 molecules), it is as unlikely as a miracle.
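The gap between the two scales can be quantified with the standard fluctuation estimate that a spontaneous entropy dip of size dS has probability of order exp(-dS/k_B); a rough sketch:

```python
import math

# Standard fluctuation estimate: the probability of a spontaneous dip of
# size dS in the entropy of an isolated system is of order exp(-dS / k_B).
k_B = 1.380649e-23  # Boltzmann constant, J/K

for dS in (1e-22, 1e-21, 1.0):  # J/K: few-atom-scale dips vs. a macroscopic one
    log10_p = -dS / (k_B * math.log(10))
    print(f"dS = {dS:g} J/K  ->  probability ~ 10^{log10_p:.3g}")
# The few-atom dips are merely rare; the 1 J/K macroscopic dip comes out
# around 10^(-3e22), which is what "as unlikely as a miracle" understates.
```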
You do have spontaneous entropy decreases in very “small” environments. For a gas in a box with 3 particles, the entropy fluctuates on human timescales.
No, it really isn’t. What definition of Entropy are you using?
For the information-theory definition of Entropy:
Consider the simplest possible system: a magical cube box made of forcefields that never deforms or interacts with its contents in any way. Inside is a single Neutron with a known (as close as Heisenberg uncertainty lets us) position and momentum. When that Neutron reaches the side of the box, by definition (since the box is completely unyielding and unreactive) the Neutron will deform, translating its kinetic energy into some sort of internal potential energy, and then rebound. However, we become more uncertain of the position and momentum of the Neutron after each bounce: a unidirectional increase in Entropy. There is no mechanism (without injecting additional energy into the system, i.e. making another observation) by which the system can become more orderly.
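A classical cartoon of that bounce-by-bounce loss of knowledge, treating the quantum uncertainty as a small classical spread in initial speed (all numbers hypothetical):

```python
import numpy as np

# Model our uncertainty as an ensemble of trajectories with a tiny spread
# in initial speed, and watch the position spread grow bounce after bounce.
rng = np.random.default_rng(0)
L = 1.0                                        # box width (arbitrary units)
v = 1.0 + 1e-3 * rng.standard_normal(100_000)  # slightly uncertain speeds

def positions(t):
    """Ensemble positions in [0, L] after time t, with elastic reflections."""
    u = (0.5 + v * t) % (2 * L)          # unfold the reflections
    return np.where(u > L, 2 * L - u, u)

for t in (0, 10, 100, 1_000, 10_000):
    print(f"t = {t:>6}: position spread (std) = {positions(t).std():.4f}")
# The spread only grows, saturating at the uniform-distribution value
# L/sqrt(12) ~ 0.2887; nothing in the dynamics ever re-sharpens it.
```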
For the temperature definition of Entropy:
Consider the gas box with 3 particles you mentioned. If all 3 particles are moving at the same speed (recall that, for a gas, temperature is defined by the average kinetic energy of the particles), the system is at maximum possible Entropy. What mechanism exists to cause the particles to vary in speed (given the magical non-deforming non-reactive box we are containing things in)?
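For reference, the equipartition relation behind that parenthetical, with made-up numbers for three atoms at equal speed:

```python
# Equipartition for a monatomic ideal gas: (3/2) k_B T = <(1/2) m v^2>,
# so T = m <v^2> / (3 k_B). Hypothetical numbers: three helium atoms,
# all moving at exactly the same speed.
k_B = 1.380649e-23    # J/K
m_He = 6.6464731e-27  # kg per helium-4 atom

speeds = [1300.0, 1300.0, 1300.0]  # m/s, all equal
mean_v2 = sum(v * v for v in speeds) / len(speeds)
T = m_He * mean_v2 / (3 * k_B)
print(f"T = {T:.0f} K")  # ~271 K for these numbers
```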
“What mechanism exists to cause the particles to vary in speed (given the magical non-deforming non-reactive box we are containing things in)?”
The system is a compact deterministic dynamical system, and Poincaré recurrence applies: it will return infinitely many times arbitrarily close to any low-entropy state it has previously been in. Since there are only 3 particles, the recurrence time is short.
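A toy version of that claim, using non-interacting particles on a ring rather than colliding particles in a box (so it shows the quasi-periodic near-returns of a tiny phase space, not the theorem in full generality; the speeds are hypothetical):

```python
import numpy as np

# 3 non-interacting particles on a ring of length 1 with incommensurate
# speeds. We track how close the whole configuration comes back to its
# starting point.
speeds = np.array([1.0, 2 ** 0.5, 3 ** 0.5])

t = np.arange(0.01, 2000.0, 0.01)
# displacement of each particle from its start, folded into [-1/2, 1/2]
disp = np.abs((np.outer(t, speeds) + 0.5) % 1.0 - 0.5).max(axis=1)
i = disp.argmin()
print(f"closest return: max displacement {disp[i]:.4f} at t = {t[i]:.2f}")
# For 3 particles near-returns arrive fast; recurrence times explode with N.
```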
Having read the definitions, I’m pretty sure that a system with Poincaré recurrence does not have a meaningful Entropy value (or its Entropy is absolutely constant with time). You cannot get any useful work out of the system, or within the system. The speed distribution of particles never changes. Our ability to predict the position and momentum of the particles is constant. Is there some other definition of Entropy that actually shows fluctuations like you’re describing?
The ideal gas does have a mathematical definition of entropy; Boltzmann used it in his statistical derivation of the second law:
https://en.wikipedia.org/wiki/Entropy_(statistical_thermodynamics)
Here is an account of Boltzmann’s work and the first objections to his conclusions:
https://plato.stanford.edu/entries/statphys-Boltzmann/
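A minimal sketch of that statistical definition, and of why it fluctuates visibly for 3 particles but not at macroscopic N, coarse-graining the box into left/right halves (random snapshots stand in for real dynamics):

```python
import math
import random

# Boltzmann entropy S = k_B ln W for the coarse macrostate "n of N particles
# in the left half of the box", where W = C(N, n). Units of k_B throughout.
def S(n_left, N):
    return math.log(math.comb(N, n_left))

random.seed(1)
for N in (3, 100):
    snapshots = [S(sum(random.random() < 0.5 for _ in range(N)), N)
                 for _ in range(10)]
    line = " ".join(f"{s:.1f}" for s in snapshots)
    print(f"N = {N:>3}: S/k_B = {line}   (max {S(N // 2, N):.1f})")
# For N = 3 the entropy swings between 0 and ln 3 from snapshot to snapshot;
# for N = 100 it already hugs the maximum, and at N ~ 10^23 it never budges.
```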
The Poincaré recurrence theorem doesn’t imply that. It doesn’t imply the system is ergodic, and it only applies to “almost all” states (the exceptions are guaranteed to have measure zero, but then again, so is the set of all numbers anyone will ever specifically think about). In any case, the entropy doesn’t change at all, because it’s a property of an abstraction.
An ideal gas in a box is an ergodic system. The Poincaré recurrence theorem states that a volume-preserving dynamical system (i.e. any conservative system in classical physics) returns infinitely often to any neighbourhood (as small as you want) of almost any point of the phase space.
Wikipedia has a good summary of what Entropy as used in the 2nd law actually is: https://en.m.wikipedia.org/wiki/Entropy