In the post, How Much Evidence Does It Take, Eliezer described the concept of ‘bits’ of information. For example, if you wanted to choose winning lottery numbers with a higher probability, you could have a box that beeps for the correct lottery number with 100% probability and beeps for an incorrect number with only 25% probability. Then the application of this box would represent 2 bits of information—because it winnows your possible winning set by a factor of 4.
During the chat, we discussed this definition of “bits”. MrHen brought in some mathematics to discuss the case where the box beeps with less than 100% probability for the correct number (reduced box sensitivity, with possibly the same specificity), and how this would affect the calculation of bits.
An interesting piece of trivia came up. Measuring information in base 2 is, of course, arbitrary; instead of measuring bits we could measure “bels” or “bans” (base 10).
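As a concrete check on the arithmetic above, here is a short Python sketch (the variable names are my own, not from the post) that turns the box's likelihood ratio into bits and bans:

```python
import math

# Numbers from the post: the box beeps for the winning number with
# probability 1.0, and for a losing number with probability 0.25.
p_beep_given_winner = 1.0
p_beep_given_loser = 0.25

likelihood_ratio = p_beep_given_winner / p_beep_given_loser  # 4.0

bits = math.log2(likelihood_ratio)   # base-2 units ("bits")
bans = math.log10(likelihood_ratio)  # base-10 units ("bans")

print(bits)  # 2.0 -> a beep winnows the candidate set by a factor of 4
print(bans)  # ~0.602
```

The same likelihood ratio just gets measured on a different logarithmic scale; only the base changes.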
Wow, I wish I’d been there for that (had to go to a trade group meeting) -- that’s one of the topics that interests me!
Btw, I think you mean that a beep, rather than a mere application of the box, gives you 2 bits of information. Just applying the box will usually (~75% of the time) produce no beep. The average information gained per application of the box (aka the entropy of the beep variable, aka the expected surprisal of using the box) would be ~0.81 bits.
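A quick sketch of that entropy calculation, assuming the number being tested is almost certainly a loser (so the box beeps with probability 0.25):

```python
import math

p_beep = 0.25  # beep probability for a losing number

# Entropy (expected surprisal) of the binary beep/no-beep outcome:
# H = -sum(p * log2(p)) over the two outcomes.
entropy = -(p_beep * math.log2(p_beep)
            + (1 - p_beep) * math.log2(1 - p_beep))

print(entropy)  # ~0.811 bits per application of the box
```

The rare beep is worth 2 bits, but the common no-beep outcome is worth only about 0.415 bits, and the probability-weighted average works out to roughly 0.81 bits.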
So I ended up at the game in person. How did this go? Any insights to share with those of us who weren’t there?
This is a transcript of the chat log.
And yes, there are also nats (base e).
I believe the point was that a beep constitutes 2 bits of evidence for the hypothesis that the number is winning.