“Disorder” isn’t very clear or helpful, really. Still, a physicist or chemist is likely to be able to give an account of entropy that is coherent and not controversial among scientists. On the other hand, most attempts to explain entropy by non-scientists would probably satisfy the fifth criterion. But the fact that there is some class of people who can use the word meaningfully seems to distinguish “entropy” from “emergence.”
Not as a definition. But many explanations which use “entropy” could also use “disorder” without becoming overtly incoherent or contradicting the accounts given by most others, which was the requirement of #5. Of those explanations which use “entropy” in a more technical sense, many could go with my second example; and the rest could use something more specific, like an information-theoretic expression, or a physical prediction.
But many explanations which use “entropy” could also use “disorder” without becoming overtly incoherent or contradicting the accounts given by most others, which was the requirement of #5.
That works for physical entropy. For the sense of entropy used in information theory, a better substitution would be uncertainty.
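To make the “uncertainty” reading concrete, here is a minimal sketch of the standard information-theoretic expression, Shannon entropy, which measures the average uncertainty of a probability distribution in bits (the function name is mine, for illustration):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: the expected surprisal, -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: 1 bit.
print(shannon_entropy([0.5, 0.5]))    # 1.0
# A heavily biased coin is nearly certain, so its entropy is much lower.
print(shannon_entropy([0.99, 0.01]))
# A sure thing has no uncertainty at all.
print(shannon_entropy([1.0]))         # 0.0
```

On this reading, substituting “uncertainty” works because the quantity really does track how unsure you should be about the next outcome, which “disorder” only gestures at.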