There may be interesting or even amusing information theoretic senses in which tracking the microscopic details can be said to have zero entropy, but these do not impact the physics of the system.
Yes this is the entire point; entropy seems to be disassociated from what’s “out there.”
As to whether or not a 0 K ice cube will melt: it will melt if you put it in contact with a gas that has enough kinetic energy in it that, when that energy is divided among all the molecules in the system, the average energy per molecule is greater than k*274 K, where k is Boltzmann’s constant.
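The melting criterion in that sentence can be sketched numerically. Everything here except Boltzmann's constant is a made-up number for illustration:

```python
k = 1.380649e-23  # Boltzmann constant, J/K

# Hypothetical illustration (all quantities assumed): a 0 K ice cube in
# contact with a warm gas.  On the crude criterion above, the cube melts
# if the total kinetic energy, shared over every molecule in the combined
# system, exceeds roughly k * 274 K per molecule.
n_gas = 1e23    # number of gas molecules (assumed)
n_ice = 1e22    # number of molecules in the ice cube (assumed)
E_gas = 1000.0  # total kinetic energy of the gas, J (assumed)

avg_energy_per_molecule = E_gas / (n_gas + n_ice)  # J per molecule
melts = avg_energy_per_molecule > k * 274          # crude melting criterion
print(melts)  # True with these assumed numbers
```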
And no one has said otherwise. But if you consider gas+information together, you can no longer consistently say it’s at anything other than 0 K.
There is nothing arbitrary or “in the mind” about thermodynamics; it is a simplified map of a large range of real situations.
I think you’re misunderstanding what “in the mind” means. It does not mean that our thoughts can influence physics. Rather, it means that quantities like entropy and temperature depend (to you) on the physical model in which you’re viewing the system.
I don’t think I am misunderstanding anything. But it is possible that I am merely not misunderstanding the physics, I suppose. But I participated in the other threads and I am pretty sure I know what we are talking about.
To the extent that you want to define something that allows you to characterize a boiling pot of water as having either zero entropy or zero temperature, define away. I will “merely” point out that the words entropy and temperature have already been applied to that situation by others who have come before you and in a way which is not altered by any knowledge you may have beyond the extensive quantities of the boiling pot of water.
I will point out that your quantities of “entropy” and “temperature” break the laws of thermodynamics in probably every respect. In your system, energy can flow from a colder object to a hotter object. In your system, entropy can decrease in a closed system. In summary, not only are your definitions of entropy and temperature confusing a rather difficult but unconfused subject, but they are also violating all the relationships that people versed in thermodynamics carry around about entropy and temperature.
So what is the possible point of calling your newly defined quantities entropy and temperature? It seems to me the only point is to piggyback your relatively useless concepts on the well-deserved reputation of entropy and temperature in order to get them an attention they do not deserve.
No matter how much information I have about a pot of boiling water, it is still capable of turning a turbine with its steam, cooking rice, and melting ice cubes. If you redefine temperature so that the boiling water is at 0 K but still melts ice cubes by transferring energy to the ice, even though the ice is at a much hotter 250 K, then I sure wish you would call this thing, which has nothing to do with average kinetic energy or with which direction energy will flow, something else.
To the extent that you want to define something that allows you to characterize a boiling pot of water as having either zero entropy or zero temperature, define away.
It’s not an arbitrary definition made for fun. It is—as I’ve pointed out—the only definition that is consistent. Any other set of definitions will lead to ‘paradoxes’, like Maxwell’s demon or various other ‘violations’ of the 2nd law.
I will point out that your quantities of “entropy” and “temperature” break the laws of thermodynamics in probably every respect.
On the contrary, they are the only consistent way of looking at thermodynamics.
In your system, energy can flow from a colder object to a hotter object.
And why not? Every time a battery powers an (incandescent) flashlight, energy is flowing from a colder object to a hotter object.
It seems to me the only point is to piggyback your relatively useless concepts on the well-deserved reputation of entropy and temperature in order to get them an attention they do not deserve.
The point is to put thermodynamics on a rigorous and general footing. That’s why Jaynes and others proposed MaxEnt thermodynamics.
No matter how much information I have about a pot of boiling water, it is still capable of turning a turbine with its steam, cooking rice, and melting ice cubes
These things you speak of are due to the energy in the boiling water, not the temperature, and energy is not changed no matter how much you know about the system. A system at 0 K can still carry energy. There is nothing in the laws of physics that prevents this.
And why not? Every time a battery powers an (incandescent) flashlight, energy is flowing from a colder object to a hotter object.
Actually, no. The temperature of the electrons moving in the current is quite high. At least according to the uncontroversial definitions generally used. These electrons have a lot of kinetic energy.
A system at 0 K can still carry energy. There is nothing in the laws of physics that prevents this.
Actually there is. 0 K is the state where no further energy can be extracted from the system. So a 0 K system can’t do work on any system, whether the other system is at 0 K also, or not.
Do you have in mind that a motor could be cooled down to 0 K and then run, or that a battery could be cooled down to 0 K and then run? It could be that parts of a battery or motor are at 0 K (perhaps the metal rods or cylinders of a motor) while the motor still turns and produces energy. But the motor itself is not at 0 K: it has motion, kinetic energy, which it can shed by stopping running.
By the way, do you have any links to anything substantial that puts the temperature of microscopically known boiling water at 0 K? So far I’ve been contradicting your assertions without seeing the details that might lie behind them.
The temperature of the electrons moving in the current is quite high. At least according to the uncontroversial definitions generally used.
I have to say, that definition is quite new to me. The electron temperature in a piece of copper is pretty much the same as the rest of the copper, even when it’s carrying many amps of current.
But to give an even more straightforward example, think of a cold flywheel turning a hot flywheel. I suppose you’re going to say that the cold flywheel is ‘hot’ because it’s turning. I’m sorry but that’s not how thermodynamics works.
Actually there is. 0 K is the state where no further energy can be extracted from the system. So a 0 K system can’t do work on any system, whether the other system is at 0 K also, or not.
What is the exact law that says this? I’d really like to see it. The thermodynamics you’re talking about seems drastically different from the thermodynamics I learned in school.
But the motor itself is not at 0 K: it has motion, kinetic energy, which it can shed by stopping running.
Forget a motor, just imagine an object at 0 K moving linearly through outer space.
By the way, do you have any links to anything substantial that puts the temperature of microscopically known boiling water at 0 K?
EY gives plenty of references in his linked sequences on this.
But to give an even more straightforward example, think of a cold flywheel turning a hot flywheel. I suppose you’re going to say that the cold flywheel is ‘hot’ because it’s turning. I’m sorry but that’s not how thermodynamics works.
The equipartition theorem says that a system in thermal equilibrium has energy k*T/2 per degree of freedom. Consider a rigid flywheel weighing 1 kg and spinning at around 1 m/s, so that the kinetic energy from its rotation is about 1 J. I’d like to say this system has one degree of freedom, the spinning of the flywheel, and so its temperature is 1/k = 7e22 K. But in case you point out that the flywheel can be flying through space as well as spinning on any one of three axes, let’s say its temperature is 7e22/6, about 1e22 K.
A macroscopic rigid system has massively more weight than molecules in a gas but not very many degrees of freedom. If temperatures can be assigned to these at all, they are MASSIVE temperatures.
But it is not a rigid body, you say; it is a solid made of atoms that can vibrate. Indeed, the solid flywheel might be made of a piece of metal at 200 K or 300 K or whatever temperature you want to have heated it to. But in an experiment where a metal flywheel at 300 K is spun up and spun down, the energy of the spinning does not “thermalize” with the internal vibrational energy of the flywheel. Since it does not thermalize, these are thermodynamically uncoupled systems, which means the effective temperature of the macroscopic rotation of the flywheel is in the 1e22 K range.
This IS how thermodynamics works. We don’t usually talk about thermo of macroscopic objects with very few degrees of freedom. That doesn’t mean we can’t, or even that we shouldn’t.
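For what it's worth, the arithmetic behind the flywheel figures looks like this (a sketch using T = 2E/(n*k) from equipartition; the estimate above drops the factor of 2, which doesn't matter at this order of magnitude):

```python
k = 1.380649e-23  # Boltzmann constant, J/K

E = 1.0  # kinetic energy of the spinning flywheel, J (from the example)

# Equipartition: E = n * k * T / 2  =>  T = 2 * E / (n * k)
T_one_dof = 2 * E / (1 * k)  # spin treated as the only degree of freedom
T_six_dof = 2 * E / (6 * k)  # 3 translational + 3 rotational degrees of freedom

print(f"{T_one_dof:.1e} K")  # on the order of 1e23 K
print(f"{T_six_dof:.1e} K")  # on the order of 1e22 K
```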
Actually there is. 0 K is the state where no further energy can be extracted from the system. So a 0 K system can’t do work on any system, whether the other system is at 0 K also, or not.
What is the exact law that says this? I’d really like to see it. The thermodynamics you’re talking about seems drastically different from the thermodynamics I learned in school.
Forget a motor, just imagine an object at 0 K moving linearly through outer space.
OK. As with the flywheel, a 1 kg object moving at 1 m/s through space has 1 J of kinetic energy. Even if we attribute 6 degrees of freedom to this object, that kinetic energy corresponds to about 1e22 K.
EY gives plenty of references in his linked sequences on this.
I looked through this thread and there are no links to any sequences. I searched the Wiki for “Jaynes” and there were very few references, only to mind projection fallacy. So if in fact there is any link anywhere to an argument that a pot of water with microscopically known positions and velocities is somehow at 0 K, please just point me to it.
Let me see if I can pick apart your misconceptions.
About the flywheel example, no, rotation does not lead to temperature, because the rotational energy of the flywheel is not thermal energy. You seem to be mixing up thermal with non-thermal energy. In thermodynamics we assign several different kinds of energy to a system:
Total energy: Internal energy + Potential energy + Kinetic energy
Potential energy: Energy due to external force fields (gravity, electromagnetism, etc.)
Kinetic energy: Energy due to motion of the system as a whole (linear motion, rotational motion, etc.)
Internal energy/thermal energy: The energy that is responsible for the temperature of a system.
But here’s the kicker: the division between these concepts is not a fundamental law of nature, but depends on your model. So yes, you could build a model where rotation is included in thermal energy. But then rotation would be part of the entropy as well, so at nonzero temperature you could not model it as rotating at a fixed speed! You’d have to model the rotation as a random variable, which clearly contradicts rotation at a fixed speed. That is, unless you also set the temperature to 0 K, in which case the entropy would be zero and you could set the rotation to a fixed speed.
Now about the relationship between internal energy and degrees of freedom. You’re misunderstanding what a degree of freedom is. The equipartition theorem says that the average energy of a particle with n degrees of freedom is nkT/2, but even if you included rotational energy as thermal energy, a large spinning object has much more than one degree of freedom. It has degrees of freedom associated with its many vibrational modes. It has so many vibrational modes that the associated ‘temperature’ is actually very low, not high as you describe. Indeed, if it were to ‘thermalize’ (say, through friction), it would not warm up the object that much. If it were true that the temperature due to rotation were 1e22 K, then letting it thermalize would violate conservation of energy by tens of orders of magnitude (the object would turn into quark-gluon plasma and explode violently, vaporizing half of the planet Earth).
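A quick sanity check on "would not warm up the object that much", assuming a 1 kg steel flywheel (specific heat of roughly 450 J/(kg*K), an assumed round figure) and the 1 J of rotational energy from the earlier example:

```python
E = 1.0    # rotational energy that thermalizes through friction, J
m = 1.0    # flywheel mass, kg (from the example)
c = 450.0  # specific heat of steel, J/(kg*K) (assumed round figure)

delta_T = E / (m * c)  # resulting temperature rise of the flywheel
print(delta_T)         # roughly 0.002 K, nowhere near 1e22 K
```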
And finally, you cannot calculate absolute energy for an object moving linearly through space. The kinetic energy depends on the rest frame.
Let me see if I can pick apart your misconceptions.
OK, I have a PhD in Applied Physics. I have learned thermo and statistical mechanics a few times, including two graduate-level courses. I have recently been analyzing internal and external combustion engines as part of my job, and have relearned some parts of thermo for that. It may be that despite my background, I have not done a good job of explaining what is going on with thermo. But what I am explaining here is, at worst, the way a working physicist would see thermo, informed by a science that explains a shitload of reality, and in a way which is no more subjective than the “spooky action at a distance” of electromagnetic and gravitational fields. I realize appealing to my credentials is hardly an argument. However, I am pretty sure that I am right, and I am pretty sure that what I have been claiming is all within spitting distance of discussions and examples of thermo and stat mech calculations and considerations that we really talked about when I was learning this stuff.
My confidence in my position is not undermined by anything you have said, so far. I have asked you for a link to something with some kind of detail that explicates the 0 K, 0 entropy boiling water position, or some version of the broken concepts you are speaking generally about. You have referred only to things already linked in the thread, or in the sequence on this topic, and I have found no links in the thread that were relevant. I have asked you again to link me to something and you haven’t.
But despite your not giving me anything to work with from your side, I have believed I understand what you are claiming. For the entropy side I would characterize it this way. Standard entropy makes a list of all states at the appropriate energy of an isolated system and says there is equal probability of the system being in any of these. And so the entropy at this energy of this isolated system is log(N(E)) where N(E) is the number of states that have energy E.
I think what you are saying is that if you have detailed knowledge of which state the system is in now, then with the details you have you can predict the exact trajectory of the system through state space, and so the number of states the system can be in is 1 because you KNOW which one it must be in. And so its entropy is 0.
A version of my response would be: so you know which state the system is in at any instant of time, and so you feel like the entropy is log(1) at any instant in time. But the system still evolves through time through all the enumerated states. And its entropy is log(N(E)), the count of states it evolves through, and it is unchanged that you know at each instant which state it is in. So I know the details of every collision because I follow the motions in detail, but every collision still results in the system changing states, as every collision changes the direction and speed of two molecules in the system, and over some short time, call it a thermalization time, the system explores nearly all N(E) states. So despite our superior knowledge that gives us the time-sequence of how the system changes from state to state and when, it still explores N(E) states, and its properties of melting ice or pushing pistons are still predictable purely from knowledge of N(E), and are not helped or hurt by a detailed knowledge of the time evolution of the system, the details of how it goes about exploring all N(E) states.
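To make the counting concrete, here is a toy version of log(N(E)) for an isolated system of two-level "molecules" (a hypothetical model, not the boiling pot): the count of states at energy E, and hence the entropy, is fixed by N and E, and is the same whether or not we can track which particular microstate the system occupies at each instant.

```python
from math import comb, log

# Toy isolated system: N two-level molecules, each holding 0 or 1 unit of
# energy, with total energy E units.  The microstates at energy E are the
# ways of choosing which E molecules are excited: N(E) = C(N, E).
N, E = 100, 30
n_states = comb(N, E)    # N(E), the count of accessible microstates
entropy = log(n_states)  # dimensionless entropy, log N(E)

# Perfect knowledge of the trajectory picks out one state per instant,
# but over time the system still ranges over the same N(E) states.
print(n_states, entropy)
```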
I have just reread this article on Maxwell’s Demons. I note that at no point do they deviate from the classic definitions of temperature and entropy. And indeed, the message seems to be that once the demon is part of the system, the system grows classical entropy exactly as predicted, the demons themselves are engines producing the entropy increases needed to balance all equations.
Now about the relationship between internal energy and degrees of freedom. You’re misunderstanding what a degree of freedom is.
I said rotation or movement of a rigid body. By definition a rigid body doesn’t have modes of vibration in it. Of course you may think that all real bodies are not truly rigid, as they are made out of molecules. But if the macroscopic motion is only weakly coupled to the vibrational modes of the material it is made of, then this is essentially saying the macroscopic and vibrational systems are insulated from each other, and so maintain their own internal temperatures, which can be different from each other, just as two gases separated by a heat-insulating wall can be at different temperatures, a feature often used in thermodynamic calculations.
And finally, you cannot calculate absolute energy for an object moving linearly through space. The kinetic energy depends on the rest frame.
You actually asked me to “Forget a motor, just imagine an object at 0 K moving linearly through outer space.” And so I used the example you asked me to use.
Credentials aren’t very relevant here, but if we’re going to talk about them, I have a PhD in engineering and a BS in math (minor in physics).
and in a way which is no more subjective than the “spooky action at a distance” of electromagnetic and gravitational fields.
Again, as I’ve pointed out at least once before, entropy is not subjective. Being dependent on model and information does not mean it is subjective.
And so the entropy at this energy of this isolated system is log(N(E)) where N(E) is the number of states that have energy E.
Right off the bat, this is wrong. In a continuous system the state space could be continuous (uncountably infinite) and so N(E) makes no sense. “Logarithm of the number of states of the system” is just a loose way of describing what entropy is, not a precise way.
and so the number of states the system can be in is 1 because you KNOW which one it must be in. And so its entropy is 0.
The number of states a system can be in is always 1! A system (a classical system, at least) can never be in more than one state at a time. The ‘number of states’, insofar as it is loosely used, means the size of the state space according to our model and our information about the system.
And its entropy is log(N(E)), the count of states it evolves through, and it is unchanged that you know at each instant which state it is in.
There are several things wrong with this. First of all, it assumes the ergodic hypothesis (time average = space average) and the ergodic hypothesis is not required for thermodynamics to work (although it does make a lot of physical systems easier to analyze). But it also has another problem in that it makes entropy dependent on time scale. That is, choosing a fine time scale would decrease entropy. This is not how entropy works. And at any rate, it’s not what entropy measures anyway.
I said rotation or movement of a rigid body. By definition a rigid body doesn’t have modes of vibration in it.
But I’m not assuming a rigid body. You are. There is no reason to assume a rigid body. I offered an example of a cold flywheel turning a hot flywheel, as a system where energy moves from a cold object to a hot object. You decided for some reason that the flywheels must be rigid bodies. They aren’t, at least not in my example.
Right off the bat, this is wrong. In a continuous system the state space could be continuous (uncountably infinite) and so N(E) makes no sense. “Logarithm of the number of states of the system” is just a loose way of describing what entropy is, not a precise way.
A finite system at finite energy has a finite number of states in quantum mechanics. So if we restrict ourselves to situations that could ever be realized by human investigators in our universe, conclusions reached using discrete states are valid.
There are several things wrong with this. First of all, it assumes the ergodic hypothesis
No, I am considering all possible states N(E) of the system at energy E. Many of these states will be highly spatially anisotropic, and I am still including them in the count.
But it also has another problem in that it makes entropy dependent on time scale. That is, choosing a fine time scale would decrease entropy. This is not how entropy works. And at any rate, it’s not what entropy measures anyway.
Since you won’t show me in any detail the calculation that leads to water having 0 temperature or 0 entropy if you have special knowledge of it, I can only work from my guesses about what you are talking about. And my guess is that you achieve low entropy, 0 entropy, because with sufficient special knowledge you reduce the number of possible states to 1 at any instant: the state that the system is actually in at that instant. But if you count the number of states the system has been in as time goes by, every time two things collide and change velocity you bounce to another state, and so even with perfect knowledge of the time evolution, over a long enough time you still cover all possible N(E) states. Over an insufficiently long time you cover a smaller number of states. In fact, the behavior of states looked at on time scales too short to get “thermalization,” that is, too short to allow the system to change through a significant fraction of the available states, might possibly be describable with an entropy that depends on time. But the last thing I want to do is define new things and call them entropy when they do not have the properties of the classic entropy I have been advocating through this entire thread.
You decided for some reason that the flywheels must be rigid bodies. They aren’t, at least not in my example.
Given the length of this thread, I think it would be better if you read all the sentences in each paragraph rather than responding to one out of context.
Seriously, can’t you give me an example of your 0 K 0 entropy boiling water and tell me what you hope to know from this example that we don’t know already? We have probably gotten most of what we can get from an open-ended discussion of philosophy of thermodynamics. A real example from you would certainly restrict the field of discussion, possibly to something even worth doing. Who knows, I might look at what you have and agree with your conclusions.
What is the exact law that says this? I’d really like to see it.
See for example http://physics.about.com/od/glossary/g/absolutezero.htm “Absolute zero is the lowest possible temperature, at which point the atoms of a substance transmit no thermal energy—they are completely at rest.”
OK. As with the flywheel, a 1 kg object moving at 1 m/s through space has 1 J of kinetic energy. Even if we attribute 6 degrees of freedom to this object, that kinetic energy corresponds to about 1e22 K.
I looked through this thread and there are no links to any sequences. I searched the Wiki for “Jaynes” and there were very few references, only to mind projection fallacy. So if in fact there is any link anywhere to an argument that a pot of water with microscopically known positions and velocities is somehow at 0 K, please just point me to it.
Let me see if I can pick apart your misconceptions.
About the flywheel example, no, rotation does not lead to temperature, because the rotational energy of the flywheel is not thermal energy. You seem to be mixing up thermal with non-thermal energy. In thermodynamics we assign several different kinds of energy to a system:
Total energy: Internal energy + Potential energy + Kinetic energy
Potential energy: Energy due to external force fields (gravity, electromagnetism, etc.)
Kinetic energy: Energy due to motion of the system as a whole (linear motion, rotational motion, etc.)
Internal energy/thermal energy: The energy that is responsible for the temperature of a system.
But here’s the kicker: The division between these concepts is not a fundamental law of nature, but depends on your model. So yes, you could build a model where rotation is included in thermal energy. But then, rotation would be part of the entropy as well, so at nonzero temperature you could not model it as rotating at a fixed speed! You’d have to model the rotation as a random variable. Clearly this contradicts with rotation at a fixed speed. That is, unless you also set the temperature to 0 K, in which entropy would be zero and so you could set the rotation to a fixed speed.
Now about the relationship between internal energy and degrees of freedom. You’re misunderstanding what a degree of freedom is. The equipartition theorem says that the average energy of a particle with n degrees of freedom is nkT/2, but even if you included rotational energy as thermal energy, a large spinning object has much more than one degree of freedom. It has degrees of freedom associated with its many vibrational modes. It has so many vibrational modes that the associated ‘temperature’ is actually very low, not high as you describe. Indeed, if it were to ‘thermalize’ (say, through friction), it would not warm up the object that much. If it were true that the temperature due to rotation is 1e22, then if you let it thermalize it would violate conservation of energy, by tens of orders of magnitude (it would turn into quark-gluon plasma and explode violently, vaporizing half of the planet Earth).
And finally, you cannot calculate an absolute energy for an object moving linearly through space. The kinetic energy depends on the reference frame.
Ok, I have a PhD in Applied Physics. I have learned thermo and statistical mechanics a few times including two graduate level courses. I have recently been analyzing internal and external combustion engines as part of my job, and have relearned some parts of thermo for that. It may be that despite my background, I have not done a good job of explaining what is going on with thermo. But what I am explaining here is, at worst, the way a working physicist would see thermo, informed by a science that explains a shitload of reality, and in a way which is no more subjective than the “spooky action at a distance” of electromagnetic and gravitational fields. I realize appealing to my credentials is hardly an argument. However, I am pretty sure that I am right and I am pretty sure that what I have been claiming are all within spitting distance of discussions and examples of thermo and stat mech calculations and considerations that we really talked about when I was learning this stuff.
My confidence in my position is not undermined by anything you have said, so far. I have asked you for a link to something with some kind of detail that explicates the 0 K, 0 entropy boiling water position, or some version of the broken concepts you are speaking generally about. You have referred only to things already linked in the thread, or in the sequence on this topic, and I have found no links in the thread that were relevant. I have asked you again to link me to something and you haven’t.
But despite your not giving me anything to work with from your side, I believe I understand what you are claiming. For the entropy side I would characterize it this way. Standard entropy makes a list of all states at the appropriate energy of an isolated system and says there is an equal probability of the system being in any of these. And so the entropy of this isolated system at this energy is log(N(E)), where N(E) is the number of states that have energy E.
I think what you are saying is that if you have detailed knowledge of which state the system is in now, then with the details you have you can predict the exact trajectory of the system through state space, and so the number of states the system can be in is 1 because you KNOW which one it must be in. And so its entropy is 0.
A version of my response would be: so you know which state the system is in at any instant of time, and so you feel like the entropy is log(1) at any instant in time. But the system still evolves through time through all the enumerated states, and its entropy is log(N(E)), the count of states it evolves through; this is unchanged by your knowing at each instant which state it is in. So I know the details of every collision because I follow the motions in detail, but every collision still changes the state of the system, since every collision changes the direction and speed of two molecules, and over some short time, call it a thermalization time, the system explores nearly all N(E) states. So despite our superior knowledge, which gives us the time-sequence of how the system changes from state to state and when, it still explores N(E) states, and its properties of melting ice or pushing pistons are still predictable purely from knowledge of N(E), neither helped nor hurt by a detailed knowledge of the time evolution of the system, the details of how it goes about exploring all N(E) states.
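The state counting in the paragraphs above can be made concrete with a standard toy model (my own minimal example, not one from the thread): take N two-level particles, each with energy 0 or epsilon. At total energy E = m*epsilon the number of microstates is the binomial coefficient C(N, m), and the microcanonical entropy is S = k ln N(E).

```python
import math

k = 1.380649e-23  # Boltzmann constant, J/K

def microcanonical_entropy(N, m):
    """Entropy S = k ln N(E) for N two-level particles with m of them
    excited (total energy E = m * epsilon). Here N(E) = C(N, m)."""
    n_states = math.comb(N, m)
    return k * math.log(n_states)

N, m = 100, 30
print(f"N(E) = {math.comb(N, m)}")
print(f"S = {microcanonical_entropy(N, m):.3e} J/K")
```

Knowing which particular microstate the system occupies at some instant does not change the count C(N, m), which is the position being argued for here: the entropy is a property of the count, not of which state the trajectory happens to pass through at a given moment.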
I have just reread this article on Maxwell’s Demons. I note that at no point do they deviate from the classic definitions of temperature and entropy. And indeed, the message seems to be that once the demon is part of the system, the system grows classical entropy exactly as predicted, the demons themselves are engines producing the entropy increases needed to balance all equations.
I said rotation or movement of a rigid body. By definition a rigid body has no modes of vibration in it. Of course you may object that no real body is truly rigid, since all are made of molecules. But if the macroscopic motion is only weakly coupled to the vibrational modes of the material, then the macroscopic and vibrational systems are essentially insulated from each other, and so maintain their own internal temperatures, which can differ from each other, just as two gases separated by a heat-insulating wall can be at different temperatures, a feature often used in thermodynamic calculations.
You actually asked me to “Forget a motor, just imagine an object at 0 K moving linearly through outer space.” And so I used the example you asked me to use.
Credentials aren’t very relevant here, but if we’re going to talk about them, I have a PhD in engineering and a BS in math (minor in physics).
Again, as I’ve pointed out at least once before, entropy is not subjective. Being dependent on model and information does not mean it is subjective.
Right off the bat, this is wrong. In a continuous system the state space could be continuous (uncountably infinite) and so N(E) makes no sense. “Logarithm of the number of states of the system” is just a loose way of describing what entropy is, not a precise way.
The number of states a system can be in is always 1! A system (a classical system, at least) can never be in more than one state at a time. The ‘number of states’, insofar as it is loosely used, means the size of the state space according to our model and our information about the system.
There are several things wrong with this. First of all, it assumes the ergodic hypothesis (time average = space average) and the ergodic hypothesis is not required for thermodynamics to work (although it does make a lot of physical systems easier to analyze). But it also has another problem in that it makes entropy dependent on time scale. That is, choosing a fine time scale would decrease entropy. This is not how entropy works. And at any rate, it’s not what entropy measures anyway.
But I’m not assuming a rigid body. You are. There is no reason to assume a rigid body. I offered an example of a cold flywheel turning a hot flywheel, as a system where energy moves from a cold object to a hot object. You decided for some reason that the flywheels must be rigid bodies. They aren’t, at least not in my example.
A finite system at finite energy has a finite number of states in quantum mechanics. So if we restrict ourselves to situations that could ever be realized by human investigators in our universe, conclusions reached using discrete states are valid.
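A minimal illustration of that finiteness (a toy example of my own): for a particle in a 1D box the energy levels are E_n = n^2 * E_1, so the number of states with energy at most E is floor(sqrt(E/E_1)), which is finite for any finite E.

```python
import math

def n_states_in_box(E, E1):
    """Count 1D particle-in-a-box levels E_n = n^2 * E1 with E_n <= E."""
    if E < E1:
        return 0
    return int(math.floor(math.sqrt(E / E1)))

E1 = 1.0  # ground-state energy in arbitrary units (assumed)
for E in (1.0, 100.0, 1e6):
    print(f"E = {E:g}: {n_states_in_box(E, E1)} states")
```

Raising the energy cutoff admits more states, but the count never becomes infinite, which is why discrete-state arguments suffice for any physically realizable system.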
No, I am considering all possible states N(E) of the system at energy E. Many of these states will be highly spatially anisotropic, and I am still including them in the count.
Since you won’t show me in any detail the calculation that leads to water having 0 temperature or 0 entropy if you have special knowledge of it, I can only work from my guesses about what you are talking about. And my guess is that you achieve low entropy, 0 entropy, because with sufficient special knowledge you reduce the number of possible states to 1 at any instant: the state the system is actually in at that instant. But if you count the number of states the system has been in as time goes by, every time two things collide and change velocity you bounce to another state, and so even with perfect knowledge of the time evolution, over a long enough time you still cover all possible N(E) states. Over an insufficiently long time you cover a smaller number of states. In fact, the behavior of states looked at on time-scales too short for “thermalization,” that is, too short to allow the system to move through a significant fraction of the available states, might possibly be describable by an entropy that depends on time. But the last thing I want to do is define new things and call them entropy when they do not have the properties of the classic entropy I have been advocating for through this entire thread.
Given the length of this thread, I think it would be better if you read all the sentences in each paragraph rather than responding to one out of context.
Seriously, can’t you give me an example of your 0 K 0 entropy boiling water and tell me what you hope to know from this example that we don’t know already? We have probably gotten most of what we can get from an open-ended discussion of philosophy of thermodynamics. A real example from you would certainly restrict the field of discussion, possibly to something even worth doing. Who knows, I might look at what you have and agree with your conclusions.