To quote Esar ( :D ):
I wasn’t sure whether the calculations made between observations (the updating of probabilities) should count as “new information about the world” or not. From a strictly information theoretic point of view they don’t (since the calculations are entailed by the observations so far, there’s no reduction in Shannon entropy after making them). From a psychological point of view they do, since we learn as much—or more—from the updates as we do from the observations themselves.
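To make the "entailed by the observations" point concrete, here is a minimal sketch in Python, using a made-up coin example that is not from Esar's comment: the posterior over hypotheses is a deterministic function of the observations, so recomputing it conveys nothing the observations did not already carry, even though carrying out the calculation is where the explicit update (and the felt learning) happens.

```python
from math import log2

# Hypothetical setup: two candidate coins, a fair one and one biased 80%
# toward heads, with equal prior probability. (These numbers are made up
# purely for illustration.)
hypotheses = {"fair": 0.5, "biased": 0.8}   # P(heads | hypothesis)
prior = {"fair": 0.5, "biased": 0.5}

def update(prior, flips):
    """Posterior over hypotheses after a sequence of flips ("H"/"T").
    Pure calculation: the output is entailed by (prior, flips)."""
    posterior = dict(prior)
    for flip in flips:
        for h, p_heads in hypotheses.items():
            likelihood = p_heads if flip == "H" else 1 - p_heads
            posterior[h] *= likelihood
        total = sum(posterior.values())
        posterior = {h: p / total for h, p in posterior.items()}
    return posterior

def entropy(dist):
    """Shannon entropy (in bits) of a distribution over the hypotheses."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

observations = ["H", "H", "T", "H", "H"]     # the only genuinely new information

posterior = update(prior, observations)
print("posterior:", posterior)
print("entropy over hypotheses before observing:", entropy(prior))
print("entropy over hypotheses after observing: ", entropy(posterior))

# Re-running the calculation on the same observations gives exactly the same
# posterior: the updating step reduces no further Shannon entropy, it only
# makes explicit what the observations already entailed.
print("same result on recompute:", update(prior, observations) == posterior)
```

The entropy over hypotheses drops only because of the observed flips; the call to update() just surfaces that drop, which is the information-theoretic sense in which the calculation adds nothing, and the psychological sense in which it adds a lot.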