Also, I still don’t buy the claim about the temperature. You said in the linked comment that putting a known-microstate cup of tea in contact with an unknown-microstate cup of tea wouldn’t really be thermal equilibrium because it would be “not using all the information at your disposal. And if you don’t use the information it’s as if you didn’t have it.”
If I know the exact state of a cup of tea, and am able to predict how that state will evolve in the future, the cup of tea has zero entropy.
Then suppose I take a glass of water that is Boltzmann-distributed. It has some spread over possible microstates—the bigger the spread, the higher the entropy (and also the temperature, for Boltzmann-distributed things).
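(A toy sketch of that parenthetical, with made-up energy levels and units where k_B = 1, to show entropy growing with temperature for a Boltzmann distribution:)

```python
import math

def boltzmann(energies, temperature):
    """Boltzmann distribution p_i proportional to exp(-E_i / T), with k_B = 1."""
    weights = [math.exp(-e / temperature) for e in energies]
    z = sum(weights)  # partition function
    return [w / z for w in weights]

def entropy(probs):
    """Gibbs/Shannon entropy in nats."""
    return -sum(p * math.log(p) for p in probs if p > 0)

levels = [0.0, 1.0, 2.0, 3.0]  # hypothetical energy levels
entropies = [entropy(boltzmann(levels, t)) for t in (0.5, 1.0, 2.0)]
print(entropies)  # strictly increasing: hotter means a wider spread
```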
Then you put the tea and the water in thermal contact. Now, for every possible microstate of the glass of water, the combined system evolves to a single final microstate (only one, because you know the exact state of the tea). The combined system is no longer Boltzmann in either subsystem, and has the same entropy as the original glass of water, just moved into different microstates.
Note that it didn’t matter what the water’s temperature was—all that mattered was that the tea’s distribution had zero entropy. The fact that there has been no increase in entropy is the proof that all the information has been used. If the water had the same average energy as the tea, so that no macroscopic amount of energy was exchanged, then these things would be in thermal equilibrium by your standards.
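(A discrete toy version of the argument, with made-up microstates and probabilities, and an arbitrary bijection standing in for the dynamics: because the tea’s state is known, deterministic evolution just relabels the water’s microstates, so entropy is unchanged.)

```python
import math

def shannon_entropy(dist):
    """Entropy in bits of a probability distribution over microstates."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# "Water": a spread over four microstates (hypothetical probabilities).
water = {0: 0.5, 1: 0.25, 2: 0.125, 3: 0.125}
# "Tea": a single exactly-known microstate, i.e. zero entropy.
tea = 7

# Deterministic joint dynamics: any map injective in the water state for
# fixed tea state will do. This particular formula is just a stand-in.
def evolve(w, t):
    return ((w + t) % 4, (3 * t + w) % 11)

# Each water microstate lands on exactly one final joint microstate,
# so the distribution is relabeled, not spread out or compressed.
final = {}
for w, p in water.items():
    final[evolve(w, tea)] = final.get(evolve(w, tea), 0.0) + p

print(shannon_entropy(water))  # 1.75 bits
print(shannon_entropy(final))  # 1.75 bits -- unchanged
```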
Then you put the tea and the water in thermal contact. Now, for every possible microstate of the glass of water, the combined system evolves to a single final microstate (only one, because you know the exact state of the tea).
After you put the glass of water in contact with the cup of tea, you will quickly become uncertain about the state of the tea. In order to still know the microstate, you need to be fed more information.
If you have a Boltzmann distribution, you still know all the microstates—you just have a probability distribution over them. Time evolution in contact with a zero-entropy object moves probability from one microstate to another in a predictable way, with neither compression nor spreading of the probability distribution.
Sure, this requires obscene amounts of processing power to keep track of, but not particularly more than it took to play Maxwell’s demon with a known cup of tea.
That’s wrong on both counts.

Firstly, even if you actually had a block of ice at 0 K and put it in thermal contact with a warm glass of water, the total system entropy would increase over time. It is completely false that the numbers of initial and final microstates are the same. Entropy depends on volume as well as temperature. (To see why this is the case, consider that you’re dealing with a continuous phase space, not a discrete one.)
Additionally, your example doesn’t apply to what I’m talking about, because nowhere are you using the information about the cup of tea. Again, as I said, if you don’t use the information it’s as if you didn’t have it.
I am fully aware that saying it in this way is clumsy and hard to understand (and not 100% convincing, even though it really is true). That’s why I’m looking for a more abstract, theoretical way of saying it.
I’m not really sure why you say volume is changing here.
I don’t understand how you want information to be used, if not to calculate a final distribution over microstates, or what you think “losing information” is if not an increase in entropy. If we’re having some sort of disconnect I’d be happy to talk more, but if you’re trolling me I would like to not be trolled.
I’m not really sure why you say volume is changing here.
Think about putting a packet of gas next to a vacuum and allowing it to expand. In this case it’s even easier to see that the requirements of your thought experiment hold—you know the exact state of the vacuum, because it has no microstates. Yet the total system entropy will still increase as the molecules of gas expand to fill the vacuum. Even if you have perfect information about the gas at the beginning (zero entropy), at the end of the experiment you will not. You will have some uncertainty. This is because the phase space itself has expanded.
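(For the ideal-gas version of this, the standard result is ΔS = nR ln(Vf/Vi), positive even though no heat or work is exchanged—the accessible phase-space volume grows with the container. A sketch, with hypothetical numbers:)

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def free_expansion_entropy(n_moles, v_initial, v_final):
    """Entropy change dS = nR ln(Vf/Vi) for free expansion of an ideal gas.

    No heat flows and no work is done, yet entropy rises because the
    accessible phase-space volume grows with the container volume.
    """
    return n_moles * R * math.log(v_final / v_initial)

# One mole doubling its volume into vacuum:
dS = free_expansion_entropy(1.0, 1.0, 2.0)
print(dS)  # about 5.76 J/K, i.e. R ln 2
```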
If we’re having some sort of disconnect I’d be happy to talk more,
I think we are. I suggest becoming familiar with R. Landauer and C. H. Bennett’s work. I’d be happy to discuss this further if we are on the same page.
Think about putting a packet of gas next to a vacuum and allowing it to expand. In this case it’s even easier to see that the requirements of your thought experiment hold
Oh, I see, you’re thinking of particle exchange, like if one dumped the water into the tea. This case is not what I intended—by thermal contact I just mean exchange of energy.
With identical particles, the case with particle exchange gets complicated. There might even be some interesting physics there.
The thermodynamics of energy exchange and mass exchange are actually similar. You still get the increase in entropy, even if you are just exchanging energy.
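(Concretely, for pure energy exchange between two bodies at different temperatures—no particles moved—the Clausius bookkeeping already gives a positive total entropy change. A sketch, treating the bodies as roughly constant-temperature reservoirs with made-up numbers:)

```python
def entropy_change_energy_exchange(q, t_hot, t_cold):
    """Total entropy change when heat q flows from a hot body to a cold one.

    The hot body loses q/t_hot of entropy, the cold one gains q/t_cold;
    since t_cold < t_hot the sum is positive (temperatures in kelvin).
    """
    return q * (1.0 / t_cold - 1.0 / t_hot)

# 10 J of heat flowing from 350 K tea to 300 K water (hypothetical numbers):
print(entropy_change_energy_exchange(10.0, 350.0, 300.0))  # positive: entropy rises
```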
On the one hand, this is a good point, and it exposes a weakness in my argument—if states are continuous rather than discrete, one can increase or decrease entropy even with deterministic time-evolution by spreading out or squeezing probability mass.
But I don’t know how far this analogy of yours holds outside the microcanonical ensemble. Exchanging energy definitely works like exchanging particles when all you know is the total energy, but there’s no entropy increase when both are in a single microstate, or when both have the same Boltzmann distribution (hm, or is there?).
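(The continuous-case caveat can be made precise with differential entropy: a deterministic, invertible map x → ax shifts a Gaussian’s entropy by ln|a|, so spreading with |a| > 1 raises it and squeezing with |a| < 1 lowers it. A sketch:)

```python
import math

def gaussian_diff_entropy(sigma):
    """Differential entropy (nats) of a Gaussian with standard deviation sigma:
    h = 0.5 * ln(2 * pi * e * sigma^2)."""
    return 0.5 * math.log(2 * math.pi * math.e * sigma ** 2)

# A deterministic, invertible rescaling x -> a*x turns sigma into a*sigma:
a = 2.0
h_before = gaussian_diff_entropy(1.0)
h_after = gaussian_diff_entropy(a * 1.0)
print(h_after - h_before)  # ln 2, about 0.693: spreading raises entropy
# with a < 1 the shift is negative: squeezing lowers it
```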
Nope, sorry.
I’ll think about it too.