I understand what you are saying, but I am not convinced that there is a big difference.
Entropy measures the uncertainty in the distribution of the parameters; it reflects the information we have about the system, not a property of the system alone.
How would you change this uncertainty without disturbing the system?
But energy (relative to ground state) doesn’t change no matter how much information you gain about a system’s internal microstates.
How would you gain this information without disturbing the system (and hence changing its energy)?
EDIT: see also my reply to spxtr.
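The point under dispute (that entropy quantifies uncertainty over microstates, and that gaining information lowers it) can be illustrated with a minimal sketch of the Gibbs entropy S = -k_B Σ p_i ln p_i. The distributions below are hypothetical examples, not tied to any particular physical system:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def gibbs_entropy(probs):
    """Gibbs entropy S = -k_B * sum(p * ln p) over microstate probabilities."""
    return -K_B * sum(p * math.log(p) for p in probs if p > 0)

# Uniform distribution over 4 microstates: maximal uncertainty.
s_uniform = gibbs_entropy([0.25] * 4)   # = k_B * ln(4)

# Suppose a measurement narrows the possibilities to 2 equally likely states.
s_narrowed = gibbs_entropy([0.5, 0.5])  # = k_B * ln(2)

# The entropy drops: the distribution sharpened because we learned something,
# even though nothing was said here about the energies of the states.
```

Whether such a measurement can be performed without "disturbing the system" is exactly the question raised above; the sketch only shows that the entropy is a functional of the probability distribution.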
You have to define what ‘disturbing the system’ means. This is just the classical Maxwell’s demon question, and you can most definitely change this uncertainty without changing the thermodynamics of the system. Look at http://en.wikipedia.org/wiki/Maxwell%27s_demon#Criticism_and_development
Especially, the paragraph about Landauer’s work is relevant (and the cited Scientific American article is also interesting).
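For reference, Landauer's principle puts a floor on the heat dissipated when information is erased: at least k_B T ln 2 per bit. A minimal sketch of that bound (the 300 K figure is just an illustrative room-temperature choice):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def landauer_bound(temperature_k, bits=1):
    """Minimum heat dissipated erasing `bits` bits at temperature T:
    E >= bits * k_B * T * ln(2) (Landauer's principle)."""
    return bits * K_B * temperature_k * math.log(2)

# At room temperature (300 K), erasing one bit dissipates at least ~2.87e-21 J.
e_min = landauer_bound(300.0)
```

This is why the demon cannot win indefinitely: acquiring and storing information is thermodynamically cheap, but resetting the demon's memory to reuse it pays this cost back.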