It’s not really any more “unhelpful” than the statement that the number of bits of information needed to pick out a specific state of a system always increases. And that one’s just straight Shannon entropy.
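The Shannon-entropy reading can be made concrete with a minimal sketch (plain Python, illustrative only): the entropy of a distribution is the average number of bits needed to identify which state actually obtains.

```python
import math

def shannon_entropy_bits(probs):
    """Shannon entropy in bits: the average number of bits needed
    to pick out which state occurred, given this distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A system equally likely to be in any of 8 states needs 3 bits to pin down.
print(shannon_entropy_bits([1/8] * 8))  # 3.0

# A system whose state is known with certainty needs 0 bits.
print(shannon_entropy_bits([1.0]))      # 0.0
```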
Sure; the point is that we have lots of equivalent formulations of entropy and I don’t see the need to pick out one of them as the correct way of understanding it. One or another may be more intuitively appealing to particular students, or better suited to particular problems, but they’re all maps and not territories.
Given a quantum state, you can always tell me the entropy of that specific quantum state. It’s 0. If that’s the territory, then where is entropy in the territory?
There’s something subtle about what’s map and what’s territory in density matrices. I’d like to think of the territory as a pure quantum state and of maps as mixed states, but… If John thinks the electron in the centre of this room is either spin-up or spin-down but he has no idea which (i.e. he assigns probability 50% to each), and Jane thinks the electron in the centre of this room is either spin-east or spin-west but she has no idea which, then for any possible experiment whatsoever, the two of them would assign the same probability distribution to the outcome. There’s something that puzzles me about this, but I’m not sure what that is.
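The puzzle can be made concrete: those two 50/50 mixtures are described by literally the same density matrix, which is why no experiment can distinguish them. A minimal numpy sketch (illustrative; “east”/“west” here are taken to be the ±x spin eigenstates):

```python
import numpy as np

# Spin-up/down basis states
up = np.array([1, 0], dtype=complex)
down = np.array([0, 1], dtype=complex)
# Spin-east/west, i.e. the +x and -x eigenstates
east = (up + down) / np.sqrt(2)
west = (up - down) / np.sqrt(2)

def mixture(a, b):
    """Density matrix for a 50/50 classical mixture of two pure states."""
    return 0.5 * np.outer(a, a.conj()) + 0.5 * np.outer(b, b.conj())

rho1 = mixture(up, down)    # "up or down, no idea which"
rho2 = mixture(east, west)  # "east or west, no idea which"
print(np.allclose(rho1, rho2))  # True: both equal I/2
```

Since every measurement probability is a function of the density matrix alone, identical density matrices mean identical predictions for any possible experiment.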
How much work can I extract from a system in that state? It’s often useful to keep one’s theoretical eyes on the thermodynamical ball.
Helmholtz free energy (A, or F) = E − TS in the thermodynamic limit, right? So A = E in the case of a known quantum state.
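As a sanity check on “S = 0 for a known quantum state”, here is a small numpy sketch of the von Neumann entropy S(ρ) = −Tr(ρ ln ρ): it vanishes for a pure state and equals ln 2 for a maximally mixed qubit, so A = E − TS reduces to A = E exactly when the state is known.

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho ln rho) in nats, with 0*ln(0) taken as 0."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]  # drop zero eigenvalues
    return float(-np.sum(evals * np.log(evals)))

pure = np.outer([1.0, 0.0], [1.0, 0.0])  # |0><0|, a fully known state
mixed = np.eye(2) / 2                    # maximally mixed qubit

print(von_neumann_entropy(pure))   # 0.0
print(von_neumann_entropy(mixed))  # ~0.693, i.e. ln 2
```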
So statistical mechanics was my weakest subject, and we’re well beyond my expertise. But if you’re really saying that we cannot extract any work from a system if we know its quantum state, that is highly counterintuitive to me, and suggests a missed assumption somewhere.
Helmholtz free energy (A) is basically the work you can extract (or more precisely, the free energy change between two states is the work you can extract by moving between those two states). So if A = E, where E is the energy that satisfies the Schroedinger equation, that means you can extract all the energy.
Sort of like Maxwell’s demon.
Excuse me, the thought somehow rotated 180 degrees between brain and fingers. My point from a couple of exchanges up remains: How did you come to know this quantum state? If you magically inject information into the problem you can do anything you like.
We guessed and got really lucky?
In other words, magic. As I said, if you’re allowed to use magic you can reduce the entropy as much as you like.
So is it impossible to guess and be lucky? Usually in this context the word “magic” would imply impossibility.
Well no, it’s not impossible, but the chance of it happening is obviously 2^-N, where N is the number of bits required to specify the state. It follows that if you try the guess on 2^N systems, you will on average get lucky and extract useful work once; which is, of course, the same amount of useful work you would get from those 2^N systems anyway, whether you’d made a guess or not. Even on the ignorance model of entropy, you cannot extract anything useful from randomness!
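The arithmetic of “guessing and getting lucky” is easy to check directly (a trivial sketch; N = 10 is an arbitrary choice):

```python
N = 10                   # bits required to specify the state
p_guess = 2 ** -N        # chance a blind guess is exactly right
systems = 2 ** N         # number of systems we try the guess on

expected_lucky_hits = systems * p_guess
print(expected_lucky_hits)  # 1.0: on average, one lucky system out of 2^N
```

One lucky extraction per 2^N attempts is exactly the yield you could have had without guessing at all, which is the point: the expected work gained by guessing is zero.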
Measurements work well if you want to know what quantum state something is in. Or alternately, you could prepare the state from scratch—we can do it with quite a few atoms now.
And I hardly think doing a measurement with low degeneracy lets you do anything you like. You can’t violate conservation of energy, or conservation of momentum, or conservation of angular momentum, or CPT symmetry. It’s only thermodynamics that stops necessarily applying.
Yes, ok, but what about the state of the people doing the measurements or the preparation? You can’t have perfect information about them as well; that’s the second law of thermodynamics for you. You could just as well skip the step that mentions information and say “If we had a state of zero entropy we could make it do a lot of work.” So you could, and the statement “If we had a state that we knew everything about we could make it do a lot of work” is equivalent, but I don’t see where one is more fundamental, useful, intuitive, or correct than the other. The magic insertion of information is no more helpful than a magic reduction of entropy.
Wouldn’t Gibbs free energy be more appropriate? pV should be available for work too.
I find myself slightly confused by that definition. Energy in straight quantum mechanics (or classical Newtonian mechanics) is a torsor. There is no preferred origin, and adding any constant to all the states changes the evolution not at all. It therefore must not change the extractable work. So the free energies are clearly incorrectly defined, and must instead be defined relative to the ground state. In which case, yes, you could extract all the energy above that, if you knew the precise state, and could manipulate the system finely enough.
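The torsor point can be checked directly: adding a constant c to every energy level multiplies the time-evolution operator by a global phase e^(−ict), which drops out of every expectation value. A small numpy sketch (a diagonal Hamiltonian is chosen so the evolution operator can be written down explicitly; all the numbers are arbitrary):

```python
import numpy as np

H = np.diag([0.0, 1.5])   # diagonal Hamiltonian: U(t) = diag(exp(-i E_k t))
c = 7.0                   # arbitrary constant added to every energy level
t = 0.3                   # arbitrary evolution time

U1 = np.diag(np.exp(-1j * np.diag(H) * t))
U2 = np.diag(np.exp(-1j * np.diag(H + c * np.eye(2)) * t))

psi = np.array([1, 1], dtype=complex) / np.sqrt(2)
A = np.array([[0, 1], [1, 0]], dtype=complex)  # some observable

# Expectation values of A after evolving under the two Hamiltonians
ev1 = (U1 @ psi).conj() @ A @ (U1 @ psi)
ev2 = (U2 @ psi).conj() @ A @ (U2 @ psi)
print(np.allclose(ev1, ev2))  # True: the shift is unobservable
```

U2 equals e^(−ict) · U1, and that global phase cancels in every expectation value, so no experiment can detect the offset; only energy differences (relative to, say, the ground state) are physical.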
1) Meh.
2) Right. I clarified this two posts down: “the free energy change between two states is the work you can extract by moving between those two states.” So just like for energy, the zero point of free energy can be shifted around with no (classical) consequences, and what really matters (like what comes out of engines and stuff) is the relative free energy.
Only for pure states. Any system you have will be mixed.
I believe you mean “you will have incomplete information about any system you could really have.”
Operationally, it’s a distinction without a difference.
Since the way this whole nest of comments got started was whether it makes sense to identify entropy with incomplete information, I’d say my reply to you was made with loaded language :P