Nick Hay—IIRC the minus-log probability of an outcome is usually called “surprisal” or “self-information”. The Shannon entropy of a random variable is just the expected value of its surprisal.
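To make the relationship concrete, here is a minimal sketch (assuming log base 2, so everything is in bits; the function names are mine): surprisal of a single outcome, and entropy as the probability-weighted average of the surprisals.

```python
import math

def surprisal(p: float) -> float:
    """Surprisal (self-information) of an outcome with probability p, in bits."""
    return -math.log2(p)

def entropy(dist: list[float]) -> float:
    """Shannon entropy: the expected surprisal over the distribution, in bits."""
    return sum(p * surprisal(p) for p in dist if p > 0)

# A fair coin: each outcome has surprisal 1 bit, so the entropy is 1 bit.
print(surprisal(0.5))        # 1.0
print(entropy([0.5, 0.5]))   # 1.0

# A biased coin: the rare outcome is more surprising, but on average
# the coin is less surprising than a fair one.
print(entropy([0.9, 0.1]))   # ~0.469
```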