Nick and Recovering,

Off the top of my head (I Am Not A Physicist), if you tried this in real life:
Imagine the light thingy starting out in many different positions. Now imagine the track is frictionless. The light thingy will swing back and forth over the heavy thingy, its exact position and orbit depending on its starting position.
Does the light thingy roll to a halt? There must be friction. Friction generates heat. Heat is entropy. One subspace may go from light gray to dark gray, but another subspace goes from dark gray to light gray, and the total amplitude density is conserved.
Also, the heavy thingy itself will move as the light thingy moves toward it; pulling forces are symmetrical, by conservation of momentum. If the heavy thingy is made up of lots of little particles, they all end up in slightly different places, depending on where the light thingy was originally.
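To make that concrete, here is a classical toy sketch of my own (softened attraction in one dimension, arbitrary units, not anything from the original post): the heavy particle recoils as the light one falls toward it, the total momentum stays at zero throughout, and where the heavy particle ends up depends on where the light particle started.

```python
import numpy as np

# Toy 1-D two-body sketch (assumed softened inverse-square attraction, arbitrary units).
# The heavy particle recoils as the light one falls toward it, total momentum stays
# zero, and the heavy particle's final position depends on the light one's start.

def simulate(x_light0, m_heavy=1000.0, m_light=1.0, dt=1e-3, steps=5000, eps=0.1):
    x = np.array([0.0, x_light0])               # positions: [heavy, light]
    p = np.array([0.0, 0.0])                    # both start at rest
    m = np.array([m_heavy, m_light])
    for _ in range(steps):
        r = x[1] - x[0]
        f_light = -r / (abs(r) ** 3 + eps)      # softened attraction on the light particle
        forces = np.array([-f_light, f_light])  # equal and opposite (Newton's third law)
        p += forces * dt                        # symplectic Euler: kick...
        x += (p / m) * dt                       # ...then drift
    return x, p

for start in (3.0, 4.0, 5.0):
    x, p = simulate(start)
    print(f"light starts at {start}: heavy ends at {x[0]:+.5f}, total momentum = {p.sum():+.2e}")
```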
I think you’re confusing classical probabilities with quantum amplitudes.
The classical case is what you describe. We start with a probability distribution in which the two particles are known to be stationary but have uniform distributions over their starting positions. Then, as the system evolves, they get drawn closer together, so our uncertainty about their positions goes down. But their momenta depend on their initial positions, so we gain uncertainty about their momenta. Thus the total entropy is conserved. (If we wish, we can then use friction to shift the entropy into the heat degrees of freedom.)
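As a toy check of that classical bookkeeping (my own simplified setup: the heavy particle held fixed and the attraction approximated as harmonic), an ensemble of initial conditions with spread-out positions and nearly sharp momenta just rotates in phase space: the position spread shrinks, the momentum spread grows, and the phase-space volume, measured here by the determinant of the covariance matrix, stays put, as Liouville's theorem requires.

```python
import numpy as np

# Classical ensemble sketch (assumed harmonic attraction toward the heavy particle,
# which is held fixed; arbitrary units). Position spread shrinks, momentum spread
# grows, and det(cov), a proxy for phase-space volume, is conserved by the flow.

rng = np.random.default_rng(0)
n = 100_000
x = rng.uniform(-1.0, 1.0, n)          # uniform over starting positions
p = rng.normal(0.0, 1e-3, n)           # "stationary" up to a tiny spread

omega, dt = 1.0, 1e-3                  # harmonic frequency, leapfrog time step
for step in range(1571):               # roughly a quarter period (pi / (2 * omega * dt))
    p -= 0.5 * dt * omega**2 * x       # half kick
    x += dt * p                        # drift
    p -= 0.5 * dt * omega**2 * x       # half kick
    if step % 500 == 0 or step == 1570:
        cov = np.cov(np.vstack([x, p]))
        print(f"std(x)={x.std():.4f}  std(p)={p.std():.4f}  det(cov)={np.linalg.det(cov):.3e}")
```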
But in the quantum case the amplitudes are assigned only to the configuration space of the particles, i.e. to their positions. There is no momentum space into which we can put our spare entropy. In fact, it is possible for quantum amplitudes to become tighter as time passes, even without any outside interference (this doesn't contradict the Second Law, because the Second Law is about our uncertainty about the wavefunction, not the spread-out-ness of the wavefunction itself). For example, there are solutions of the Schrödinger equation for a free particle in which a Gaussian wavepacket evolves into one with a smaller variance (of its associated "probability distribution").
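Here's a quick numerical check of that last claim, in a split-step toy of my own (units with ħ = m = 1, "chirped" initial phase chosen by hand): under pure free-particle Schrödinger evolution, the packet's position variance shrinks for a while before it spreads out again.

```python
import numpy as np

# Sketch: evolve a chirped Gaussian wavepacket under the free-particle Schrodinger
# equation (hbar = m = 1) and watch its position variance shrink, then re-spread.

hbar = m = 1.0
N, L = 4096, 200.0
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
dx = x[1] - x[0]
k = 2 * np.pi * np.fft.fftfreq(N, d=dx)

sigma0 = 5.0        # initial width
chirp = -0.05       # quadratic phase that makes the packet converge at first
psi = np.exp(-x**2 / (4 * sigma0**2) + 1j * chirp * x**2)
psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)

def variance(psi):
    prob = np.abs(psi)**2 * dx                      # Born-rule probability density
    mean = np.sum(x * prob)
    return np.sum((x - mean)**2 * prob)

for t in (0.0, 2.0, 4.0, 6.0, 10.0, 14.0):
    phase = np.exp(-1j * hbar * k**2 * t / (2 * m)) # free-particle propagator in k-space
    psi_t = np.fft.ifft(phase * np.fft.fft(psi))
    print(f"t = {t:5.1f}   Var(x) = {variance(psi_t):7.3f}")
```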
So I think you’re literally wrong when you say:
If instead we’d started out with a big light-gray square—meaning that both particles had amplitude-factors widely spread—then the second law of thermodynamics would prohibit the combined system from developing into a tight dark-gray diagonal line.
A system has to start in a low-entropy state to develop into a state of quantum entanglement, as opposed to just a diffuse cloud of amplitude.
Because in the quantum case the light gray square isn't representing a spread-out probability distribution. We know exactly what the wavefunction is. The light gray square and the dark gray line both represent cases of total certainty! Entropy would only come into it if we had some Bayesian uncertainty about what the wavefunction actually was: a probability distribution on the space of amplitude assignments.
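One way to see that distinction numerically, in a hypothetical 8-site toy system of my own: a wavefunction spread evenly over every site has zero von Neumann entropy, exactly like a localized one; entropy only appears once we are Bayesian-uncertain about which wavefunction we actually have, i.e. once we form a mixture.

```python
import numpy as np

# A spread-out pure state and a localized pure state both have zero von Neumann
# entropy; entropy (in nats) only appears for a mixture, i.e. Bayesian uncertainty
# over which wavefunction we have. (Hypothetical 8-site toy system.)

def von_neumann_entropy(rho):
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log(evals)))

d = 8
spread = np.ones(d) / np.sqrt(d)    # "light gray square": amplitude spread over all sites
narrow = np.zeros(d)
narrow[3] = 1.0                     # a localized state, known just as exactly

rho_spread = np.outer(spread, spread.conj())
rho_narrow = np.outer(narrow, narrow.conj())
# 50/50 Bayesian uncertainty over which of the two wavefunctions we actually have:
rho_mixture = 0.5 * rho_spread + 0.5 * rho_narrow

print("S(spread pure state)    =", von_neumann_entropy(rho_spread))   # ~0
print("S(localized pure state) =", von_neumann_entropy(rho_narrow))   # ~0
print("S(50/50 mixture)        =", von_neumann_entropy(rho_mixture))  # > 0
```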