But if I know that all the gas molecules are in one half of the container, then I can move a piston into place for free, and as the gas expands to fill the container again I can extract useful work. It seems that if I know about this increase in order, it definitely constitutes a decrease in entropy.
If you know precisely when this increase in order will occur, then your knowledge of the system is necessarily very high, and the entropy you assign to it is necessarily very low (probably close to zero) to begin with.
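To put rough numbers on both points: here is the standard single-molecule Szilard-engine bookkeeping (a sketch, assuming an ideal gas of $N$ molecules in a box of volume $V$, in contact with a heat bath at temperature $T$). Learning which half of the box each molecule occupies halves the volume you must average over per molecule, so the entropy you assign drops by

$$\Delta S \;=\; N k_B \ln\frac{V/2}{V} \;=\; -N k_B \ln 2,$$

and the subsequent isothermal expansion back to the full volume lets you extract at most

$$W \;=\; N k_B T \ln 2$$

of work: one $k_B T \ln 2$ per bit of information gained.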
I feel like this may be a semantics issue. I think that order implies information. To me, saying that a system becomes more ordered implies that I know about the increased order somehow. Under that construction, disorder (i.e., the absence of detectable patterns) is a measure of ignorance, and disorder is then closely related to entropy. You may be preserving a distinction between map and territory (i.e., between the system and our knowledge of the system) that I’m neglecting. I’m not sure which framework is more useful or productive.
I think it’s definitely an important distinction to be aware of either way.
‘Order’ is not a well-defined concept; one person’s order is another’s chaos. Entropy, on the other hand, is well-defined.
Even though entropy depends on the information you have about the system, the way it depends on that information is not subjective: any two observers with the same information about the system must arrive at exactly the same value for its entropy.
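Concretely, with the Gibbs/Shannon definition

$$S \;=\; -k_B \sum_i p_i \ln p_i,$$

the entropy is a function of the probability distribution $p$ alone, and the distribution encodes exactly what the observer knows; identical information means identical $p$, which forces an identical $S$.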
All of this might seem counterintuitive at first, but it makes sense once you realize that Entropy(system) isn’t well-defined, whereas Entropy(system, model) is precisely defined. The ‘model’ is what Bayesians would call the prior; it is always there, either implicitly or explicitly.
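Here’s a minimal sketch of the Entropy(system, model) point in Python (the one-molecule-in-eight-cells toy system and both priors are my own hypothetical illustration): the same physical setup gets a different entropy under a different model, but any two observers who share a model necessarily compute the same number.

```python
import math

def entropy_bits(prior):
    """Shannon entropy H(p) = -sum_i p_i log2(p_i), in bits."""
    return -sum(p * math.log2(p) for p in prior if p > 0)

N_CELLS = 8  # toy "gas": one molecule somewhere in 8 cells

# Model 1: an observer with no information -- uniform prior over all cells.
uniform = [1 / N_CELLS] * N_CELLS

# Model 2: an observer who knows the molecule is in the left half.
left_half = [2 / N_CELLS] * (N_CELLS // 2) + [0.0] * (N_CELLS // 2)

print(entropy_bits(uniform))    # 3.0 -- Entropy(system, uninformed model)
print(entropy_bits(left_half))  # 2.0 -- same system, better-informed model
```

Nothing about the box itself changes between the two calls; only the prior does, and the one-bit drop in entropy is exactly the one bit of which-half information the second observer has.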