A measurement is an observation that quantitatively reduces uncertainty.
A measurement reduces expected uncertainty; some particular measurement results increase uncertainty. E.g. you start out assigning 90% probability that a binary variable landed heads, then you see evidence with a likelihood ratio of 1:9 favoring tails, sending your posterior to 50-50 and your entropy from about 0.47 bits up to a full bit. However, the expectation of the entropy of your probability distribution after seeing the evidence, evaluated in advance of seeing it, is never higher than its current value.
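As a sanity check, here is an illustrative Python sketch of that exact scenario (the likelihood values 0.1/0.9 are my own choice of a pair realizing the 1:9 ratio, not from the original comment): the tails-favoring observation raises entropy from ~0.47 bits to 1 bit, yet the probability-weighted average over both possible observations still comes out below the prior entropy.

```python
import math

def entropy(p):
    """Binary entropy in bits of a (p, 1-p) distribution."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

p_heads = 0.9  # prior P(heads)
# Two possible observations, as (P(e | heads), P(e | tails)).
# "e_tails" has the 1:9 likelihood ratio favoring tails from the example.
likelihoods = {"e_tails": (0.1, 0.9), "e_heads": (0.9, 0.1)}

prior_entropy = entropy(p_heads)
expected_posterior_entropy = 0.0
for e, (l_h, l_t) in likelihoods.items():
    p_e = p_heads * l_h + (1 - p_heads) * l_t   # marginal P(e)
    posterior = p_heads * l_h / p_e             # Bayes: P(heads | e)
    print(f"{e}: P(e)={p_e:.2f}, P(heads|e)={posterior:.3f}, "
          f"entropy={entropy(posterior):.3f} bits")
    expected_posterior_entropy += p_e * entropy(posterior)

print(f"prior entropy:              {prior_entropy:.3f} bits")
print(f"expected posterior entropy: {expected_posterior_entropy:.3f} bits")
```

Seeing `e_tails` (18% of the time) sends the posterior to 0.5 and the entropy to 1 bit, but the common `e_heads` outcome concentrates the posterior near 0.99, so the expectation lands around 0.26 bits, below the ~0.47-bit prior.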
Just FYI, I think Hubbard knows this and wrote “A measurement is an observation that quantitatively reduces uncertainty” because he was trying to simplify and avoid clunky sentences. E.g. on p. 146 he writes:
It is even possible for an additional sample to sometimes increase the size of the [confidence] interval… before the next sample makes it narrower again. But, on average, the increasing sample size will decrease the size of the [confidence] interval.
The conditional entropy will always be lower than the prior entropy unless the evidence is independent of your hypothesis, in which case the conditional entropy equals the prior entropy.
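To illustrate the independence edge case (my own toy numbers, not from the comment): if the likelihoods are identical under both hypotheses, every likelihood ratio is 1:1, Bayes' rule leaves the prior untouched, and the conditional entropy works out equal to the prior entropy.

```python
import math

def entropy(p):
    """Binary entropy in bits of a (p, 1-p) distribution."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

p_heads = 0.9
# Evidence independent of the hypothesis: same likelihood under heads and tails.
likelihoods = {"e1": (0.3, 0.3), "e2": (0.7, 0.7)}

conditional_entropy = 0.0
for l_h, l_t in likelihoods.values():
    p_e = p_heads * l_h + (1 - p_heads) * l_t
    posterior = p_heads * l_h / p_e  # stays at 0.9: the evidence is uninformative
    conditional_entropy += p_e * entropy(posterior)

print(f"conditional entropy: {conditional_entropy:.6f} bits")
print(f"prior entropy:       {entropy(p_heads):.6f} bits")
```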
I’m reminded also of Russell’s comment:
The technical term for this is conditional entropy.