[ crossposted from my blog ]
Kolmogorov doesn’t think we need “entropy”.
Set physical “entropy” aside for the moment.
Here are two mutually incompatible definitions of information-theoretic “entropy”.
Wikipedia:
Soares:
[ Wikipedia quantifies the entropy as “average expected uncertainty” on a random variable with a distribution of possible values of arbitrary cardinality and assumes the underlying probability distribution is fixed, while Soares pins the cardinality of the distribution of possible values to 2 and assumes the probability distribution is a free variable. ]
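[ As a gloss, in my own notation rather than a quotation from either source: Wikipedia’s quantity is the expected surprisal of a random variable $X$ under a fixed distribution $p$ over outcomes of arbitrary cardinality,

$$H(X) = -\sum_{x} p(x)\,\log_2 p(x),$$

while a definition that pins the cardinality at 2 and treats the probability as the free variable is, on my reading, the binary entropy function of a single parameter $p$,

$$H(p) = -p\,\log_2 p - (1 - p)\,\log_2 (1 - p). \; ]$$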
Both of these definitions go by the name “Shannon entropy”.
Kolmogorov, however, would have classed Soares’s definition instead as a combinatorial definition of information, with the talk of probability and entropy being parasemantic,
and would have dispensed entirely with Wikipedia’s notion of probabilistic “entropy”: