I think we’re in agreement then, although I’ve managed to confuse myself by trying to actually do the Shannon entropy math.
In the event we don’t care about birth order, we have two relevant hypotheses to distinguish between (boy-girl at 66% and boy-boy at 33%), so the message length would only need to be 0.9 bits, if I’m applying the math for the entropy of a discrete random variable correctly. So in one somewhat odd sense Sarah would actually know more about the gender than George does.
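Spelling the sum out, in case I’ve slipped somewhere (this is just the discrete-entropy formula applied to those hypothesis weights, with George’s 50–50 for comparison):

$$H_{\text{Sarah}} = -\tfrac{2}{3}\log_2\tfrac{2}{3} - \tfrac{1}{3}\log_2\tfrac{1}{3} \approx 0.918 \text{ bits}$$

$$H_{\text{George}} = -\tfrac{1}{2}\log_2\tfrac{1}{2} - \tfrac{1}{2}\log_2\tfrac{1}{2} = 1 \text{ bit}$$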
Which, given that the original post said

> Still, it seems like Sarah knows more about the situation, where George, by being given more information, knows less. His estimate is as good as knowing nothing other than the fact that the man has a child which could be equally likely to be a boy or a girl.

may not actually be implausible. Huh.
Pragmatist is correct; I did not realize that the way I stated the problem was different from the original.
I fully understand the solution to this problem.
However, let’s look at the original problem. John only knows that one of the man’s children is a boy:
1) B, G | 0.33
2) G, B | 0.33
3) G, G | 0.00
4) B, B | 0.33
P(B | case 4) = 1, P(G | cases 1, 2) = 1
P(B) = 0.33, P(G) = 0.66
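A quick sanity check of those numbers, as a minimal Python sketch (it just enumerates the four equally likely birth orders and discards the ones inconsistent with what John knows):

```python
from itertools import product

# The four equally likely (eldest, youngest) combinations.
families = list(product("BG", repeat=2))

# John's information: at least one of the children is a boy.
consistent = [f for f in families if "B" in f]

# In how many of the surviving cases does a girl appear?
p_girl = sum("G" in f for f in consistent) / len(consistent)
print(consistent)  # [('B', 'B'), ('B', 'G'), ('G', 'B')]
print(p_girl)      # 0.666... -- matches P(G) = 0.66 above
```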
So let’s say that the woman now tells John that the boy is also the eldest:
1) B, G | 0.5
2) G, B | 0.0
3) G, G | 0.0
4) B, B | 0.5
P(B | case 4) = 1, P(G | case 1) = 1
P(B) = 0.5, P(G) = 0.5
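Running the same check with the extra piece of information (again just a sketch; the first slot is taken to be the eldest child):

```python
from itertools import product

families = list(product("BG", repeat=2))

# John's extra information: the boy is the eldest,
# so the first (eldest) slot must be 'B'.
consistent = [f for f in families if f[0] == "B"]
p_girl = sum("G" in f for f in consistent) / len(consistent)
print(consistent)  # [('B', 'B'), ('B', 'G')]
print(p_girl)      # 0.5 -- matches P(G) = 0.5 above
```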
At first I saw a problem because John obviously knows more given the second piece of information, so the fact that his estimate is worse seemed really weird. What I think is going on here is that learning more really does decrease his ability to predict the gender of the other child: before, he had 3 options, 2 of which contained a girl-answer. Now, one of those 2 answers is taken away, so he currently has 2 options, 1 of which contains a girl-answer. As he becomes more informed about the total state of the world, his ability to predict this particular piece of information decreases.
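The same flip shows up in entropy terms (a minimal sketch; `entropy` here is just the standard formula for a discrete distribution):

```python
from math import log2

def entropy(dist):
    # Shannon entropy, in bits, of a discrete probability distribution.
    return -sum(p * log2(p) for p in dist if p > 0)

print(entropy([1/3, 2/3]))  # ~0.918 bits before the 'eldest' clue
print(entropy([1/2, 1/2]))  # 1.0 bits after it -- John is more uncertain
```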
The fact that John predicts 0.5 while Sarah predicts 0.66 doesn’t mean that Sarah’s prediction is somehow better.