Let me see if I can pick apart your misconceptions.
Ok, I have a PhD in Applied Physics. I have learned thermo and statistical mechanics a few times, including in two graduate-level courses. I have recently been analyzing internal and external combustion engines as part of my job, and have relearned some parts of thermo for that. It may be that despite my background I have not done a good job of explaining what is going on with thermo. But what I am explaining here is, at worst, the way a working physicist would see thermo, informed by a science that explains a shitload of reality, and in a way which is no more subjective than the “spooky action at a distance” of electromagnetic and gravitational fields. I realize appealing to my credentials is hardly an argument. However, I am pretty sure that I am right, and I am pretty sure that what I have been claiming is all within spitting distance of the discussions, examples, and considerations from thermo and stat mech calculations that we actually worked through when I was learning this stuff.
My confidence in my position is not undermined by anything you have said so far. I have asked you for a link to something with some kind of detail that explicates the 0 K, 0 entropy boiling water position, or some version of the broken concepts you are speaking about in general terms. You have referred only to things already linked in the thread, or in the sequence on this topic, and I have found no links in the thread that were relevant. I have asked you again to link me to something, and you haven’t.
But despite your not giving me anything to work with from your side, I believe I understand what you are claiming. On the entropy side I would characterize it this way. The standard treatment lists all the states of an isolated system at the appropriate energy and says there is an equal probability of the system being in any of these. And so the entropy at this energy of this isolated system is log(N(E)) where N(E) is the number of states that have energy E.
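In symbols, this is the standard microcanonical (Boltzmann) formula, with Boltzmann's constant written explicitly:

```latex
S(E) = k_B \ln N(E)
```

Setting k_B = 1 gives the bare log(N(E)) used above.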
I think what you are saying is that if you have detailed knowledge of which state the system is in now, then you can predict the exact trajectory of the system through state space, and so the number of states the system can be in is 1 because you KNOW which one it must be in. And so its entropy is 0.
A version of my response would be: so you know which state the system is in at any instant of time, and so you feel the entropy is log(1) at any instant in time. But the system still evolves through time through all the enumerated states. And its entropy is log(N(E)), the count of states it evolves through, and that count is unchanged by your knowing at each instant which state it is in. So I know the details of every collision because I follow the motions in detail, but every collision changes the direction and speed of two molecules and so moves the system to another state, and over some short time, call it a thermalization time, the system explores nearly all N(E) states. So despite our superior knowledge, which gives us the time-sequence of how the system changes from state to state and when, it still explores N(E) states, and its properties of melting ice or pushing pistons are still predictable purely from knowledge of N(E), neither helped nor hurt by a detailed knowledge of the time evolution of the system, the details of how it goes about exploring all N(E) states.
I have just reread this article on Maxwell’s Demons. I note that at no point does it deviate from the classic definitions of temperature and entropy. And indeed, the message seems to be that once the demon is part of the system, the system generates classical entropy exactly as predicted; the demons themselves are engines producing the entropy increases needed to balance all the equations.
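For reference (my gloss, not a claim about what that particular article says): the standard way to quantify that balance is Landauer's bound. Erasing one bit of the demon's memory dissipates heat of at least

```latex
Q \ge k_B T \ln 2
```

so the demon's record-keeping generates at least as much entropy as its sorting removes.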
Now about the relationship between internal energy and degrees of freedom. You’re misunderstanding what a degree of freedom is.
I said rotation or movement of a rigid body. By definition a rigid body doesn’t have modes of vibration in it. Of course you may think that no real body is truly rigid, since all are made of molecules. But if the macroscopic motion is only weakly coupled to the vibrational modes of the material it is made of, then the macroscopic and vibrational systems are essentially insulated from each other, and so maintain their own internal temperatures, which can differ from each other, just as two gases separated by a heat-insulating wall can be at different temperatures, a feature we find often used in thermodynamic calculations.
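One way to make "their own internal temperatures" precise: when the coupling is negligible, each subsystem gets its temperature from its own entropy-energy relation,

```latex
\frac{1}{T_{\text{macro}}} = \frac{\partial S_{\text{macro}}}{\partial E_{\text{macro}}},
\qquad
\frac{1}{T_{\text{vib}}} = \frac{\partial S_{\text{vib}}}{\partial E_{\text{vib}}}
```

and nothing forces the two values to agree until the subsystems can exchange energy.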
And finally, you cannot calculate an absolute energy for an object moving linearly through space. The kinetic energy depends on the choice of reference frame.
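Concretely, in a frame moving with velocity u, an object of mass m and velocity v has

```latex
E_k = \tfrac{1}{2} m \, \lvert \mathbf{v} - \mathbf{u} \rvert^2
```

so its kinetic energy can be made anything from zero upward by a choice of frame.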
You actually asked me to “Forget a motor, just imagine an object at 0 K moving linearly through outer space.” And so I used the example you asked me to use.
Credentials aren’t very relevant here, but if we’re going to talk about them, I have a PhD in engineering and a BS in math (minor in physics).
and in a way which is no more subjective than the “spooky action at a distance” of electromagnetic and gravitational fields.
Again, as I’ve pointed out at least once before, entropy is not subjective. Being dependent on model and information does not mean it is subjective.
And so the entropy at this energy of this isolated system is log(N(E)) where N(E) is the number of states that have energy E.
Right off the bat, this is wrong. In a continuous system the state space could be continuous (uncountably infinite) and so N(E) makes no sense. “Logarithm of the number of states of the system” is just a loose way of describing what entropy is, not a precise way.
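The precise classical statement is the Gibbs entropy of the phase-space density ρ (up to an additive constant fixed by the choice of phase-space measure):

```latex
S = -k_B \int_\Gamma \rho \, \ln \rho \; d\Gamma
```

which reduces to k_B ln N(E) only in the special case of a uniform distribution over N(E) equally likely states.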
and so the number of states the system can be in is 1 because you KNOW which one it must be in. And so its entropy is 0.
The number of states a system can be in is always 1! A system (a classical system, at least) can never be in more than one state at a time. The ‘number of states’, insofar as it is loosely used, means the size of the state space according to our model and our information about the system.
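That is the crux: in the Gibbs/Shannon form, entropy is a functional of the probability distribution over microstates, not of the single state the system occupies:

```latex
S = -k_B \sum_i p_i \ln p_i =
\begin{cases}
k_B \ln N(E), & p_i = 1/N(E) \text{ (uniform over the energy shell)} \\
0, & p_j = 1 \text{ for one known state } j
\end{cases}
```

Both sides of this exchange are computing the same functional, just with different distributions.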
And its entropy is log(N(E)), the count of states it evolves through, and that count is unchanged by your knowing at each instant which state it is in.
There are several things wrong with this. First of all, it assumes the ergodic hypothesis (time average = space average) and the ergodic hypothesis is not required for thermodynamics to work (although it does make a lot of physical systems easier to analyze). But it also has another problem in that it makes entropy dependent on time scale. That is, choosing a fine time scale would decrease entropy. This is not how entropy works. And at any rate, it’s not what entropy measures anyway.
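For reference, the ergodic hypothesis being invoked is that the long-time average along one trajectory equals the ensemble average over the energy shell:

```latex
\lim_{T \to \infty} \frac{1}{T} \int_0^T f\bigl(x(t)\bigr)\, dt
= \langle f \rangle_{\text{microcanonical}}
```

That is an extra dynamical assumption about the system, not part of the definition of entropy.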
I said rotation or movement of a rigid body. By definition a rigid body doesn’t have modes of vibration in it.
But I’m not assuming a rigid body. You are. There is no reason to assume a rigid body. I offered an example of a cold flywheel turning a hot flywheel, as a system where energy moves from a cold object to a hot object. You decided for some reason that the flywheels must be rigid bodies. They aren’t, at least not in my example.
Right off the bat, this is wrong. In a continuous system the state space could be continuous (uncountably infinite) and so N(E) makes no sense. “Logarithm of the number of states of the system” is just a loose way of describing what entropy is, not a precise way.
A finite system at finite energy has a finite number of states in quantum mechanics. So if we restrict ourselves to situations that could ever be realized by human investigators in our universe, conclusions reached using discrete states are valid.
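A textbook illustration (not specific to anything in this thread): a single particle in a 3-D box of side L has discrete levels

```latex
E_{\mathbf{n}} = \frac{h^2}{8mL^2}\left(n_x^2 + n_y^2 + n_z^2\right),
\qquad n_x, n_y, n_z \in \{1, 2, \ldots\}
```

so only finitely many states sit below any finite energy, and counting states in a shell [E, E + δE] is well defined.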
There are several things wrong with this. First of all, it assumes the ergodic hypothesis
No, I am considering all possible states N(E) of the system at energy E. Many of these states will be highly spatially anisotropic, and I am still including them in the count.
But it also has another problem in that it makes entropy dependent on time scale. That is, choosing a fine time scale would decrease entropy. This is not how entropy works. And at any rate, it’s not what entropy measures anyway.
Since you won’t show me in any detail the calculation that leads to water having 0 temperature or 0 entropy if you have special knowledge of it, I can only work from my guesses about what you are talking about. My guess is that you achieve low entropy, 0 entropy, because with sufficient special knowledge you reduce the number of possible states to 1 at any instant: the state the system is actually in at that instant. But if you count the number of states the system has been in as time goes by, every time two things collide and change velocity you bounce to another state, and so even with perfect knowledge of the time evolution, over a long enough time you still cover all possible N(E) states. Over an insufficiently long time you cover a smaller number of states. In fact, the behavior of states looked at on time-scales too short to get “thermalization,” that is, too short to allow the system to move through a significant fraction of the available states, might possibly be describable by an entropy that depends on time scale. But the last thing I want to do is define new things and call them entropy when they do not have the properties of the classic entropy I have been advocating for through this entire thread.
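To make that visited-state bookkeeping concrete, here is a toy sketch of my own (the model and all the numbers are illustrative stand-ins, not physics): a "system" hopping uniformly at random among N microstates, printing how many distinct states it has visited at a few checkpoints.

```python
import random

N = 10_000        # stand-in for N(E), the number of accessible microstates
random.seed(0)    # make the toy run reproducible

visited = set()
checkpoints = {N // 10, N, 10 * N, 20 * N}  # "short" through "thermalized" times

for step in range(1, 20 * N + 1):
    state = random.randrange(N)  # each collision scatters to a random microstate
    visited.add(state)
    if step in checkpoints:
        print(f"after {step:>6} collisions: visited {len(visited):>6} of {N} states")
```

Short runs cover only a fraction of the states; past the "thermalization" time the count saturates near N, whether or not the exact hop sequence was logged along the way.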
You decided for some reason that the flywheels must be rigid bodies. They aren’t, at least not in my example.
Given the length of this thread, I think it would be better if you read all the sentences in each paragraph rather than responding to one out of context.
Seriously, can’t you give me an example of your 0 K, 0 entropy boiling water, and tell me what you hope to learn from this example that we don’t know already? We have probably gotten most of what we can get from an open-ended discussion of the philosophy of thermodynamics. A real example from you would certainly restrict the field of discussion, possibly to something even worth doing. Who knows, I might look at what you have and agree with your conclusions.