Credentials aren’t very relevant here, but if we’re going to talk about them, I have a PhD in engineering and a BS in math (minor in physics).
and in a way which is no more subjective than the “spooky action at a distance” of electromagnetic and gravitational fields.
Again, as I’ve pointed out at least once before, entropy is not subjective. Being dependent on model and information does not mean it is subjective.
And so the entropy at this energy of this isolated system is log(N(E)) where N(E) is the number of states that have energy E.
Right off the bat, this is wrong. In a continuous system the state space could be continuous (uncountably infinite) and so N(E) makes no sense. “Logarithm of the number of states of the system” is just a loose way of describing what entropy is, not a precise way.
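To make the distinction concrete, here is the standard textbook contrast, in the usual notation (these symbols are the conventional ones, not anything from this thread: Omega for the state count or phase-space volume, H for the Hamiltonian, h for Planck's constant):

```latex
% Discrete (quantum) spectrum: a genuine state count exists
S = k_B \ln \Omega(E)

% Continuous (classical) phase space: no countable N(E); the analogue is a
% phase-space volume in a thin energy shell, made dimensionless by N!\,h^{3N}
\Omega(E) = \frac{1}{N!\,h^{3N}} \int_{E \,\le\, H(q,p) \,\le\, E + \Delta E} d^{3N}q \, d^{3N}p
```

The second expression is a measure, not a count, which is exactly why "number of states" is only loose shorthand in the continuous case.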
and so the number of states the system can be in is 1 because you KNOW which one it must be in. And so its entropy is 0.
The number of states a system can be in is always 1! A system (a classical system, at least) can never be in more than one state at a time. The ‘number of states’, insofar as it is loosely used, means the size of the state space according to our model and our information about the system.
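This is why the Gibbs/Shannon form is the precise statement (standard notation again, with p_i the probability we assign to microstate i given our model and information):

```latex
S = -k_B \sum_i p_i \ln p_i
```

This reduces to k_B ln N only in the special case where our information leaves N states equally likely, p_i = 1/N. The count enters through the probability assignment, not through how many states the system "occupies" at once.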
And its entropy is log(N(E)), the count of states it evolves through, and it is unchanged by the fact that you know at each instant which state it is in.
There are several things wrong with this. First of all, it assumes the ergodic hypothesis (time average = space average) and the ergodic hypothesis is not required for thermodynamics to work (although it does make a lot of physical systems easier to analyze). But it also has another problem in that it makes entropy dependent on time scale. That is, choosing a fine time scale would decrease entropy. This is not how entropy works. And at any rate, it’s not what entropy measures anyway.
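A toy illustration of the time-scale problem. This is a sketch, not a physical model: random jumps on a made-up finite state space stand in for the dynamics, and the state count of 1000 is arbitrary.

```python
import math
import random

N_STATES = 1000          # made-up finite state space
rng = random.Random(0)   # fixed seed, so the trajectory is deterministic

visited = set()
counts = {}
for step in range(1, 10_001):
    visited.add(rng.randrange(N_STATES))  # stand-in for the dynamics
    if step in (10, 100, 1000, 10_000):
        counts[step] = len(visited)

for step in sorted(counts):
    print(f"after {step:>6} steps: {counts[step]:>4} states visited, "
          f"log(count) = {math.log(counts[step]):.2f}")
```

An "entropy" defined as log(states visited so far) keeps growing with observation time; the thermodynamic entropy of a closed equilibrium system does not. That is the whole problem with counting states along the trajectory.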
I said rotation or movement of a rigid body. By definition a rigid body doesn’t have modes of vibration in it.
But I’m not assuming a rigid body. You are. There is no reason to assume a rigid body. I offered an example of a cold flywheel turning a hot flywheel, as a system where energy moves from a cold object to a hot object. You decided for some reason that the flywheels must be rigid bodies. They aren’t, at least not in my example.
Right off the bat, this is wrong. In a continuous system the state space could be continuous (uncountably infinite) and so N(E) makes no sense. “Logarithm of the number of states of the system” is just a loose way of describing what entropy is, not a precise way.
A finite system at finite energy has a finite number of states in quantum mechanics. So if we restrict ourselves to situations that could ever be realized by human investigators in our universe, conclusions reached using discrete states are valid.
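For instance, counting levels of a single particle in a cubic box (a sketch for one particle only; energies are measured in units of h^2/(8mL^2), so the physical constants drop out of the count):

```python
import math

def states_below(e_max):
    """Count states (nx, ny, nz), each >= 1, of one particle in a cubic
    box with nx^2 + ny^2 + nz^2 <= e_max (energy in units of h^2/(8mL^2))."""
    n_cap = math.isqrt(e_max)          # no quantum number can exceed this
    return sum(
        1
        for nx in range(1, n_cap + 1)
        for ny in range(1, n_cap + 1)
        for nz in range(1, n_cap + 1)
        if nx * nx + ny * ny + nz * nz <= e_max
    )

print(states_below(3))     # ground state only
print(states_below(100))   # still a finite count
```

However large e_max is, the count is finite, so log(N(E)) is perfectly well defined for such a system.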
There are several things wrong with this. First of all, it assumes the ergodic hypothesis
No, I am considering all possible states N(E) of the system at energy E. Many of these states will be highly spatially anisotropic, and I am still including them in the count.
But it also has another problem in that it makes entropy dependent on time scale. That is, choosing a fine time scale would decrease entropy. This is not how entropy works. And at any rate, it’s not what entropy measures anyway.
Since you won’t show me in any detail the calculation that leads to water having 0 temperature or 0 entropy if you have special knowledge of it, I can only work from my guesses about what you are talking about. My guess is that you achieve low entropy, 0 entropy, because with sufficient special knowledge you reduce the number of possible states to 1 at any instant: the state that the system is actually in at that instant. But if you count the number of states the system has been in as time goes by, every time two things collide and change velocity you bounce to another state, and so even with perfect knowledge of the time evolution, over a long enough time you still cover all possible N(E) states. Over an insufficiently long time you cover a smaller number of states. In fact, the behavior of states looked at on time scales too short to get “thermalization,” that is, too short to allow the system to change through a significant fraction of the available states, might possibly be describable with an entropy that depends on time. But the last thing I want to do is define new things and call them entropy when they do not have the properties of the classic entropy I have been advocating for through this entire thread.
You decided for some reason that the flywheels must be rigid bodies. They aren’t, at least not in my example.
Given the length of this thread, I think it would be better if you read all the sentences in each paragraph rather than responding to one out of context.
Seriously, can’t you give me an example of your 0 K, 0 entropy boiling water and tell me what you hope to know from this example that we don’t know already? We have probably gotten most of what we can get from an open-ended discussion of philosophy of thermodynamics. A real example from you would certainly restrict the field of discussion, possibly to something even worth doing. Who knows, I might look at what you have and agree with your conclusions.