#5? “Entropy always increases over time” => “the disorder in a system always increases over time,” or “the number of piecewise arrangements that you effectively can’t tell the difference between always increases over time, in a closed system.”
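For what it’s worth, that second phrasing tracks the standard statistical-mechanics definition. A minimal sketch, with Ω standing for the number of microscopic arrangements you effectively can’t tell apart:

```latex
% Boltzmann entropy: S grows with the count of indistinguishable arrangements.
S = k_B \ln \Omega
% The second law, stated for an isolated ("closed") system:
\Delta S \ge 0
```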
“Disorder” isn’t very clear or helpful, really. Still, a physicist or chemist is likely to be able to give an account of entropy that is coherent and not controversial among scientists. On the other hand, most attempts to explain entropy by non-scientists would probably satisfy the fifth criterion. But the fact that there is some class of people who can use the word in a meaningful way seems to distinguish “entropy” from “emergence.”
Not as a definition. But many explanations which use “entropy” could also use “disorder” without becoming overtly incoherent or contradicting accounts given by most others, which was the requirement of #5. Of those explanations which use “entropy” in a more technical sense, many could go with my second example; and the rest could use something more specific, like an information-theoretic expression, or a physical prediction.
But many explanations which use “entropy” could also use “disorder” without becoming overtly incoherent or contradicting accounts given by most others, which was the requirement of #5.
That works for physical entropy. For the sense of entropy used in information theory, a better substitution would be uncertainty.
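For reference, this is Shannon’s standard definition, not anything specific to this thread: the entropy of a distribution is its average uncertainty, largest when every outcome is equally likely.

```latex
% Shannon entropy of a discrete distribution p_1, ..., p_n,
% measured in bits; the expected surprisal of a single draw.
H(X) = -\sum_{i=1}^{n} p_i \log_2 p_i
```

A fair coin gives H = 1 bit; a two-headed coin gives H = 0, no uncertainty at all.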
“Entropy” fulfils all the criteria. Despite this, some people manage to use it effectively.
EDIT: The word “some” was supposed to be in there!
“The number of microstates for a given macrostate tends to increase over time”
Or, are “microstate” and “macrostate” also garblejargon?
“The number of microstates for a given macrostate tends to increase over time”
That’s not true. The number of microstates per macrostate is fixed.
You are right—my mistake.
An increase in entropy is a movement from a macrostate with a smaller number of microstates to a macrostate with a larger number of microstates.
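A toy illustration (my numbers, not anything from the thread): take 100 coins, let the macrostate be the number of heads k, and let the microstates be the individual head/tail sequences. The count per macrostate is fixed, as noted above, but random flipping carries the system toward the macrostates where that count is largest:

```latex
% Microstates per macrostate for n = 100 coins with k heads:
\Omega(k) = \binom{100}{k}, \qquad
\Omega(0) = 1, \qquad
\Omega(50) = \binom{100}{50} \approx 1.01 \times 10^{29}
```

An increase in entropy is then exactly the move from the k = 0 macrostate toward k = 50: from a macrostate with few microstates to one with vastly many.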