OK this is in fact interesting. In an important sense you have already won, or I have learned something, whichever description you find less objectionable.
I still think that the real definition of entropy is as you originally said: the log of the number of allowable states, where allowable means “at the same total energy as the starting state.” To the extent entropy is then used to calculate the dynamics of a system, this unambiguous definition applies when the system moves smoothly from one thermal equilibrium to another, as some macroscopic component of the system changes slowly, slowly enough that every intermediate step looks like a thermal equilibrium, a regime also known in the trade as “reversible.”
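The state-counting definition is easy to make concrete with a toy model of my own (not anything from the discussion above): N two-level molecules, where a microstate is “allowable” exactly when its total energy matches the starting state's.

```python
from math import comb, log

def entropy(n_particles, total_energy):
    """S = ln(Omega), in units of k_B, for N two-level particles.

    A microstate assigns energy 0 or 1 to each particle, so the allowable
    microstates at total energy E are the ways of choosing which E
    particles are excited: Omega = C(N, E).
    """
    omega = comb(n_particles, total_energy)  # number of allowable states
    return log(omega)

# Doubling the system at the same energy per particle roughly doubles S,
# the extensive behavior that makes this definition as well-behaved as energy:
s_small = entropy(100, 30)
s_large = entropy(200, 60)
```

Nothing subjective enters: two people who agree on what system they are looking at (the same N and E) compute the same Ω.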
But your “10 ms after the partition is removed” statement highlights that the kinds of dynamics you are thinking of are not reversible, not the dynamics of a system in thermal equilibrium. Soon after the partition is removed, the region that used to be vacuum contains only fast-moving molecules; the slow-moving ones from the distribution haven’t had time to get there yet! Soon after that, when the fast molecules first reach the far wall, you get some interesting mixing: fast molecules bouncing off the wall and hitting slower molecules still heading toward it. And so on, frame by frame.
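That early-time sorting effect shows up even in a collisionless toy model (my own sketch, with made-up units, not anything from the discussion): start molecules in the left half with thermal speeds, remove the partition, and look a short time later at who has made it into the former vacuum.

```python
import random

random.seed(0)   # reproducible toy run
N = 100_000
T_SHORT = 0.1    # "shortly after the partition is removed", arbitrary units

# 1-D and ballistic (no collisions): molecules start uniformly in the left
# half [0, 1) with speeds drawn from a thermal (half-normal) distribution;
# the partition sits at x = 1 with vacuum beyond it.
molecules = []
for _ in range(N):
    x0 = random.random()             # initial position in the left half
    v = abs(random.gauss(0.0, 1.0))  # rightward thermal speed
    molecules.append((x0 + v * T_SHORT, v))

# Molecules found in the former vacuum (x > 1) are disproportionately fast:
far = [v for x, v in molecules if x > 1.0]
near = [v for x, v in molecules if x <= 1.0]
mean_far = sum(far) / len(far)
mean_near = sum(near) / len(near)
```

The mean speed out in the former vacuum comes out well above the mean speed of the molecules left behind, which is exactly the non-thermal structure described above; collisions are what eventually erase it.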
Eventually (seconds? less?) zillions (that’s a technical term) of collisions have occurred, and the distribution of molecular speeds in any small region of the large volume is a thermal distribution. (For an ideal gas expanding freely into vacuum the temperature ends up essentially unchanged, since no work is done; a real gas cools slightly.) But the details of how the system got to this new equilibrium are lost. The system has thermalized, come to a new thermal equilibrium.
I would still maintain that, formally, the log of the number of states is a fine definition; that the entropy thus defined is as unambiguous as energy; and that it is as useful as energy.
If you start modifying the “entropy,” counting some states more than others, there are two reasons that might make sense: 1) you are interested in non-thermal-equilibrium dynamics and, given a particular starting state, you want to count only the states the system could reach within some particular short time frame; or 2) you are equivocating, pretending that your more complete knowledge of the system’s starting point gives your entropy calculation an “in your mind” component, when all it really means is that at least one of the minds making the calculation was making it for a different system than the one in front of them.
In case 1), non-equilibrium dynamics is certainly a reasonable thing to be interested in. However, the utility of the previously and unambiguously defined entropy in calculating the dynamics of systems that reach thermal equilibrium is so high that it really falls to those who would modify it to modify the name as well. So the entropy-like calculation that counts only states reachable after 10 ms might be called the “prompt entropy” or the “evolving entropy.” It isn’t reasonable to just call it entropy and then claim an “in your mind” component to the property of entropy, because in your mind you are actually doing something different from what everybody else is doing.
In case 2), where you look in detail at the system and see a different set of reachable states than someone else saw, it is not a matter of entropy being in your mind; it is a matter of one of you being wrong about what the entropy is. My calling an orange “an apple” no more makes “apple” ambiguous than my saying 2 + 2 = 5 calls into question the objective truth of addition.
As to the machine that extracts extra energy: consider an air jet blowing a stream of high-pressure air into a chamber with a piston in it. The piston can move and you can extract energy. Someone using thermo to build an engine on this basis might just calculate the rise in pressure in the volume as the air jet blows into it, put the piston somewhere the jet is not blowing directly onto it, and then find their machine performs the way a thermo calculation would predict. That is, they build the machine so the energy from the air jet is “thermalized” with the rest of the air in the volume before it pushes on the piston. Somebody else might look at this and think, “I’m lining the piston up with the air jet so the jet blows right onto it.” They might well extract MORE energy from the motion of the piston than the person who did the thermo calculation and placed their piston out of the direct flow.

I think in every sense the person exploiting the direct flow from the air jet is building his super-thermo machine by exploiting his detailed knowledge of the state of the air in the chamber. I believe THIS is the picture you should have in your mind as you read all this stuff about Bayesian probability and entropy in the mind.

And my comment on it is this: there are plenty of machines that are non-thermo. Thermo applies to steam engines and internal combustion engines because their working fluids thermalize faster than the mechanical components move. But a bicycle, pumped by your legs, is not a thermo machine. There is some quite non-thermalized chemistry going on in your muscles, driving motions of the pedals and gears MUCH faster than any local temperature would predict, and doing interesting things on a MUCH shorter time scale than the one on which the energy involved could leak out and thermalize the rest of the system.
There is no special “in the mind” component of this non-equilibrium air-jet machine. Anybody who sees the machine I have built, where the air jet blows directly on the piston, and analyzes it will calculate the same performance I do if they have the same gas-dynamics simulation code. They will recognize that this machine is not using a thermalized volume of gas to press the piston; it is using a decidedly non-equilibrium stream of fast gas to push the piston harder.
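To put rough numbers on the two piston placements (my own illustrative figures and simplifications, not anything computed in the discussion above): compare the push from the jet's raw momentum flux with the extra pressure the same stream would produce after thermalizing in the chamber.

```python
# Assumed toy figures: air-like density, a modest jet, a small piston.
rho = 1.2      # gas density, kg/m^3
v_jet = 100.0  # jet speed, m/s
area = 0.01    # piston face area, m^2

# Piston in the direct flow: the stream's momentum flux lands on the face.
# (Stopping the stream gives rho * v^2 * A; an elastic bounce would double it.)
force_direct = rho * v_jet**2 * area

# Piston out of the flow: the jet's kinetic energy density, (1/2) rho v^2,
# thermalizes into internal energy; for a monatomic ideal gas the pressure
# rise is 2/3 of the added energy density.
force_thermalized = (2.0 / 3.0) * 0.5 * rho * v_jet**2 * area

ratio = force_direct / force_thermalized  # factor of 3 in this cartoon
```

In this cartoon the in-line piston feels three times the force of the out-of-line one, the “detailed knowledge of the state of the air” advantage in miniature; a real design would call for actual gas-dynamics simulation, as above.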
In conclusion: the kinds of special knowledge invoked to make entropy an “in your mind” quantity really go beyond the traditional objective definition of entropy while failing to give this new, different quantity a new, different name. That is an equivocation, not a subjective component of entropy, just as someone changing the definition of apple to include oranges is not proving the subjectivity of the concept of apple; they are simply using words differently than the people they are talking to and forgetting to mention it.
Further, the particular “special knowledge of details” physics discussed here is not anything new. It is mechanics. Thermodynamics is a subclass of mechanics, useful for analyzing systems in which fluids interact internally much faster than they act on the machine parts they are pushing. In those cases thermodynamic calculations work. But when the details of the system are such that it is NOT in thermodynamic equilibrium as it interacts with the moving parts of a machine, that does not make entropy subjective; it makes entropy a more difficult tool to use in the analysis, just as an apple peeler is not so useful to a guy who thinks oranges are a kind of apple.
Finally, there is an intermediate realm of mechanics in which fluids are partially thermalized, but not completely, because the dynamics of the rest of the machine are comparable to the thermalization times. There might be interesting extensions of the entropy concept that would be useful for calculating the dynamics of these systems. But the fact that only one of two minds in a room is thinking these thoughts at a given moment does not make either the original entropy concept or these new extensions any more “in the mind” than energy is. It just means the two minds each need to understand this new physics for the intermediate case; once they do, they will be using unambiguous definitions for “prompt entropy” or whatever they call it.