I think Shannon’s merit was not to define entropy, but to understand its operational meaning in terms of coding a message with a minimal number of letters, which led to the notions of the capacity of a communication channel, of error-correcting codes, and of the “bit”.
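To spell out that operational statement (in standard modern formulations, not Shannon’s original wording): for a memoryless source emitting letter $i$ with probability $p_i$, the entropy is
$$H = -\sum_i p_i \log_2 p_i,$$
and the noiseless coding theorem says that $n$ letters can be compressed into roughly $nH$ bits, and no fewer, with error probability vanishing as $n \to \infty$. The noisy-channel counterpart identifies the capacity $C = \max_{p(x)} I(X;Y)$ as the largest rate at which reliable communication is possible.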
Von Neumann’s entropy was introduced before Shannon’s entropy (in 1927, although the only reference I know is von Neumann’s book from 1932). It was also von Neumann who suggested the name “entropy” for the quantity that Shannon found. What Shannon could have noticed is that von Neumann’s entropy also has an operational meaning. But for that, he would have had to be interested in the transmission of quantum information by quantum channels, ideas that were not around at the time.
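For comparison (this is the standard definition, not something Shannon wrote down): von Neumann’s entropy of a density operator $\rho$ is
$$S(\rho) = -\operatorname{Tr}(\rho \log \rho),$$
which is just Shannon’s entropy evaluated on the eigenvalues of $\rho$. Its operational meaning in the above sense, as the number of qubits per letter needed to faithfully compress a quantum source, was only established decades later, with Schumacher’s quantum coding theorem.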