I suggest a lot of caution in thinking about how entropy appears in thermodynamics and information theory. All of statistical mechanics is based on the concept of energy, which has no analogue in information theory. Some people would suggest that for this reason the two quantities should not be called by the same term.
the “temperature” isn’t a uniform speed of all the molecules, it’s an average speed of the molecules, which in turn corresponds to a predictable statistical distribution of speeds
I assume you know this, but some readers may not: temperature is not actually equivalent to energy/speed; rather, it is defined through the derivative of entropy with respect to energy:
1/T = dS/dE
This is why we observe temperature equilibration: two systems in thermal contact trade energy so as to maximize the net entropy of the ensemble. Thus in equilibrium a small shift of energy from one system to the other must not change the ensemble entropy (to first order), which means the temperatures of the two systems must be equal.
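Spelling that step out (a standard one-line derivation, using the same 1/T = dS/dE relation for each system): since the total energy is fixed, dE_2 = -dE_1, so

\[
dS_{\text{total}} = \frac{\partial S_1}{\partial E_1}\,dE_1 + \frac{\partial S_2}{\partial E_2}\,dE_2
= \left(\frac{1}{T_1} - \frac{1}{T_2}\right) dE_1 ,
\]

and requiring this to vanish for an arbitrary small transfer dE_1 forces T_1 = T_2.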
In almost all real systems, temperature and energy are monotonically related, so you won’t go too far astray by thinking of temperature as energy. However, in theory one can imagine systems that are forced into a smaller number of states as their energies increase (dS/dE < 0) and so in fact have negative temperature:
http://en.wikipedia.org/wiki/Negative_temperature
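If you want to see this concretely, here is a minimal sketch (my own toy model, not taken from the linked article): a system of N two-level "spins", each with energy 0 or 1 in arbitrary units, so the total energy E equals the number n of excited spins. The entropy S(E) = ln C(N, n) rises, peaks at n = N/2, then falls; where it falls, 1/T = dS/dE is negative.

```python
# Toy two-level system illustrating negative temperature (dS/dE < 0).
# All names (N, entropy, temperature) are my own; k_B is set to 1.
from math import lgamma

N = 1000  # number of two-level "spins" (arbitrary choice for illustration)

def entropy(n: int) -> float:
    """ln of the number of microstates with n excited spins: ln C(N, n)."""
    return lgamma(N + 1) - lgamma(n + 1) - lgamma(N - n + 1)

def temperature(n: int) -> float:
    """1/T = dS/dE, estimated by a central difference (E = n, so dE = 2)."""
    dS_dE = (entropy(n + 1) - entropy(n - 1)) / 2.0
    return 1.0 / dS_dE

for n in (100, 400, 499, 501, 600, 900):
    print(f"E = {n:4d}   S = {entropy(n):8.2f}   T = {temperature(n):+10.2f}")

# Below E = N/2 the temperature is positive; above it (population inversion),
# adding energy shrinks the number of available microstates, so T comes out negative.
```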