I have a long list of randomly-chosen numbers between 1 and 10, and I want to know whether their sum is even or odd.
I find your example here somewhat misleading. Suppose your numbers weren't drawn uniformly from 1-10, but from {2,4,6,8,10}∪{1}. If a single number is unknown, you still know there is a 5:1 chance that it is even (and hence doesn't change the parity of the sum of the whole list). So even with one number unknown, you would still want to take the sum of the ones you do know. In this light, your example seems like an exception rather than the norm. (My main issue is that, because it feels very ad hoc, you might subconsciously come to the impression that the described behaviour is the norm.)
However, it might easily be that the class of these “exceptions” is important on its own. So I wouldn’t want to shoot down the overall idea described in the post—I like it :-).
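For what it's worth, here is a minimal sketch of the single-unknown case (the value set and the helper name are mine, purely illustrative):

```python
import random

# Five even values and one odd one, so an unknown draw is even with probability 5/6.
VALUES = [2, 4, 6, 8, 10, 1]

def guess_accuracy(n_known=20, trials=100_000):
    """Guess the total's parity from the known numbers alone, with one number unknown."""
    hits = 0
    for _ in range(trials):
        known = [random.choice(VALUES) for _ in range(n_known)]
        unknown = random.choice(VALUES)
        guess = sum(known) % 2                  # bet that the unknown number is even
        truth = (sum(known) + unknown) % 2
        hits += (guess == truth)
    return hits / trials

print(guess_accuracy())  # ~0.83, i.e. roughly 5:1 odds in favour of the guess
```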
This is basically correct if only a single number is unknown. But note that, as the number of unknown numbers increases, the odds ratio for the sum being even quickly decays toward 1:1: each additional unknown multiplies the deviation from 50/50 by (n-1)/(n+1). If the odds are n:1 with a single unknown number, then ~n unknown numbers should put us close to 1:1 (and we should approach 1:1 asymptotically at a rate which scales inversely with n).
That’s the more realistic version of the thought experiment: we have N inputs, and any single unknown input would leave us with at worst n:1 odds of guessing the outcome. As long as N >> n and a nontrivial fraction of the inputs are unknown, the signal is wiped out.
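A minimal sketch of that decay, assuming each unknown is independently even with probability n/(n+1), i.e. single-number odds of n:1 (the function name and the chosen values of k and n are mine, just for illustration):

```python
def p_sum_even(k, n):
    """Probability that the sum of k unknown numbers is even, when each one is
    independently even with probability n/(n+1)."""
    bias = ((n - 1) / (n + 1)) ** k   # residual deviation from a fair coin
    return 0.5 + bias / 2

for k in (1, 2, 5, 10, 50):
    print(k, round(p_sum_even(k, n=5), 3))
# Decays from ~0.83 at k=1 toward 0.5: the per-unknown decay rate is roughly
# 2/(n+1), so by k ≈ n the bias has shrunk by a factor of about e^2.
```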