I feel like this may be a semantics issue. I think that order implies information. To me, saying that a system becomes more ordered implies that I know about the increased order somehow. Under that construction, disorder (i.e. the absence of detectable patterns) is a measure of ignorance, and is therefore closely related to entropy. You may be preserving a distinction between map and territory (i.e. between the system and our knowledge of the system) that I'm neglecting. I'm not sure which framework is more useful/productive.
I think it’s definitely an important distinction to be aware of either way.
'Order' is not a well-defined concept: one person's order is another's chaos. Entropy, on the other hand, is well-defined.
Even though entropy depends on the information you have about the system, the way it depends on that information is not subjective: any two observers with the same information about the system must arrive at exactly the same value for its entropy.
All of this might seem counterintuitive at first, but it makes sense when you realize that Entropy(system) isn't well-defined, while Entropy(system, model) is precisely defined. The 'model' is what Bayesians would call the prior, and it is always there, either implicitly or explicitly.
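To make Entropy(system, model) concrete, here is a minimal sketch (my example, not from the original comment). The "system" is a single roll of a fair die; the "model" is each observer's prior over outcomes. Entropy is computed from the model alone, so two observers holding the same prior necessarily get the same number, and the number changes only when the prior does:

```python
from math import log2

def entropy(model):
    """Shannon entropy in bits of a probability distribution.

    `model` maps outcomes to probabilities -- the Bayesian prior.
    Entropy is a deterministic function of this model: any two
    observers holding the same model compute the same value.
    """
    return -sum(p * log2(p) for p in model.values() if p > 0)

# The same physical system (one roll of a fair six-sided die)
# under two different states of knowledge.

# Observer A knows only that the die is fair.
prior_a = {face: 1 / 6 for face in range(1, 7)}

# Observer B has additionally learned that the roll came up even.
prior_b = {face: 1 / 3 for face in (2, 4, 6)}

print(entropy(prior_a))  # ~2.585 bits: log2(6)
print(entropy(prior_b))  # ~1.585 bits: log2(3)
```

Neither number is "the entropy of the die"; each is the entropy of a (system, model) pair, which is why the quantity is observer-dependent without being subjective.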