Does the information theory definition of entropy actually correspond to the physics definition of entropy? I understand what entropy means in terms of physics, but the information theory definition of the term seems fundamentally different to me. Is it really different, or does one actually correspond to the other in some way that I'm not seeing?
Shannon's definition of entropy corresponds very closely to the definition of entropy used in statistical mechanics. It's slightly more general and devoid of "physics baggage" (macrostates and so on).
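To make the correspondence concrete (these are the standard textbook formulas, added here for illustration rather than part of the original answer): for the same probability distribution $p$ over microstates, Shannon's entropy and the Gibbs entropy of statistical mechanics differ only by a constant factor,

$$H(p) = -\sum_i p_i \log_2 p_i, \qquad S(p) = -k_B \sum_i p_i \ln p_i = (k_B \ln 2)\, H(p),$$

so the physical entropy is just the Shannon entropy of the microstate distribution rescaled by $k_B \ln 2$.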
Analogy: the Ising model of spin glasses vs. undirected graphical models (Markov random fields). The former carries a lot of baggage like "magnetization," "external field," and "energy"; the latter is just a statistical model of conditional independence on a graph. The Ising model is a special case (in fact the earliest developed case, dating to the 1920s) of a Markov random field.
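To see the "special case" claim concretely, here is a minimal sketch of my own (not from the original answer), assuming the standard Ising energy $E(\sigma) = -J\sum_{\langle i,j\rangle}\sigma_i\sigma_j - h\sum_i \sigma_i$ with arbitrarily chosen parameters: the Boltzmann weight $e^{-\beta E}$ factors into a product of pairwise and unary potentials, which is exactly the Markov random field form.

```python
import itertools
import math

# Tiny Ising model / pairwise MRF on a 3-node chain: 0 - 1 - 2.
# Parameter values are arbitrary, for illustration only.
edges = [(0, 1), (1, 2)]
J, h, beta = 1.0, 0.5, 0.7   # coupling, external field, inverse temperature

def ising_energy(spins):
    """Physics view: E = -J * sum_<ij> s_i s_j - h * sum_i s_i."""
    pair = sum(spins[i] * spins[j] for i, j in edges)
    return -J * pair - h * sum(spins)

def mrf_weight(spins):
    """Graphical-model view: product of pairwise and unary potentials."""
    pair_pot = math.prod(math.exp(beta * J * spins[i] * spins[j]) for i, j in edges)
    unary_pot = math.prod(math.exp(beta * h * s) for s in spins)
    return pair_pot * unary_pot

# The unnormalized Boltzmann weight equals the product of MRF potentials
# for every spin configuration, i.e. the two parameterizations coincide.
for spins in itertools.product([-1, +1], repeat=3):
    assert math.isclose(math.exp(-beta * ising_energy(spins)), mrf_weight(spins))
print("Boltzmann weights match the MRF potential products.")
```

The only difference is vocabulary: "coupling," "external field," and "inverse temperature" are just names for the parameters of the MRF potentials.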
Physicists have a really good nose for models.