Bit-level reasoning suggests you should flip all bits: each bit impacts the total result in 8 of the 16 equally likely cases, and in seven of those eight the coin came up tails. 7/8 * $28 + 1/8 * $4 = $25 > $21.
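For concreteness, here's a quick sanity check of that arithmetic in Python. The setup is my reconstruction from the numbers in this thread, so treat it as a sketch: a fair coin picks 1 decider bit on heads or 7 on tails, out of 8 bits; a unanimous "1" pays $28 on tails and $4 on heads; a unanimous "0" pays $21 either way.

```python
from fractions import Fraction

# Reconstructed setup (my assumption, not spelled out above): a fair coin
# picks 1 decider bit on heads or 7 decider bits on tails, out of 8 bits.
# All deciders choosing 1 pays $28 on tails, $4 on heads; choosing 0 pays $21.

# The 16 equally likely cases: coin outcome x which bit is special
# (the lone decider on heads, the one excluded bit on tails).
cases = [(coin, i) for coin in ("heads", "tails") for i in range(8)]

# A fixed bit, say bit 0, is a decider in 1 heads case and 7 tails cases.
decider_cases = [(coin, i) for coin, i in cases
                 if (coin == "heads" and i == 0) or (coin == "tails" and i != 0)]

p_tails = Fraction(sum(coin == "tails" for coin, _ in decider_cases),
                   len(decider_cases))
print(p_tails)  # 7/8: the "okay, you're a decider" update

# The post-update expected values from the quoted calculation:
print(p_tails * 28 + (1 - p_tails) * 4)  # 25: flip all bits ("choose 1")
print(21)                                # 21: "choose 0" pays $21 either way
```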
Why are we comparing to $21, the pre-flip byte-level value? The bit-level “choose 1” calculation implicitly adds the contribution of each impactful bit times the probability that the bit is impactful: 7 * (7/8 * $4) + 1 * (1/8 * $4) = $25. Similarly, let’s write “choose 0” as 7 * (7/8 * $X) + 1 * (1/8 * $21).
The only way that calculation comes out to $21, the byte-level value, is if $X is $3, as if each of the 7 bits contributes 1/7 of the $21 reward. But how can we justify setting $X to $3 in the bit-level reasoning, taking the other bits into account, when we don’t take the other bits into account when calculating the probability?
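To make the tension explicit, here is the same decomposition in Python, under the same assumed setup as the sketch above; solving for $X shows the calculation only balances if each tails-case bit claims $21/7 = $3:

```python
from fractions import Fraction

p = Fraction(7, 8)  # P(this bit is impactful), per the decider update

# "Choose 1": each impactful bit contributes $4.
choose_1 = 7 * (p * 4) + 1 * ((1 - p) * 4)
print(choose_1)  # 25

# "Choose 0": solve 7*(7/8*X) + 1*(1/8*21) = 21 for the per-bit value X.
x = (21 - (1 - p) * 21) / (7 * p)
print(x)  # 3, i.e. each of the 7 bits must claim exactly 1/7 of the $21
```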
I’m not sure I see what you’re suggesting instead, though I will point out that I believe there is an error in that section: the point of this problem is to figure out what is wrong with the calculation suggesting you should defect.
I calculated the expected values for bytes with the bit-level analysis; that is, the bit-level analysis has done the “okay, you’re a decider” updating but is still dealing with numbers on the scale of bytes. So the $21 comes from 1/8 * $21 + 7/8 * $21 = $21.
The justification for that is: if you sent in a byte of identical bits, every bit knows it is a clone of the others, so whatever one of them decides, all the others will decide as well.
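Spelled out under the same assumed payoffs as before, the update leaves the “choose 0” value untouched precisely because it pays $21 in both branches. The pre-flip “choose 1” value of $16 below is my derivation from the $28/$4 payoffs and a fair coin, not a number quoted above:

```python
from fractions import Fraction

p_tails = Fraction(7, 8)  # after the "okay, you're a decider" update

# Byte-scale payoffs: choosing 0 pays $21 in either branch,
# so the update cannot move its expected value.
print(p_tails * 21 + (1 - p_tails) * 21)          # 21

# For contrast, the pre-flip byte-level values (fair coin, no update):
print(Fraction(1, 2) * 28 + Fraction(1, 2) * 4)   # 16: "choose 1"
print(Fraction(1, 2) * 21 + Fraction(1, 2) * 21)  # 21: "choose 0"
```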