Going back to each of the finite cases, we can condition the finite case on the population that can support up to m rounds. Iterating by population size presupposes nothing about the game state, and we can construct the Bayes Probability table for such games.
For a population M that supports at most m rounds of play, the probability that a player will be in any given round $n \le m$ is $\frac{2^{n-1}}{M}$, and the sum of the probabilities that a player is in round n from $n = 1 \to m$ is $\sum_{n=1}^{m} \frac{2^{n-1}}{M} = \frac{2^m - 1}{M}$. We can let $M = 2^m - 1$, because any additional population will not be sufficient to support an $(m+1)$-th round until the population reaches $2^{m+1} - 1$, which just trades m for m+1, and we will ultimately take the limit anyhow.
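As a quick numerical sanity check (a sketch of mine, not part of the original post; the function name is my own), with $M = 2^m - 1$ the round-membership probabilities $\frac{2^{n-1}}{M}$ should sum to exactly 1:

```python
def round_probabilities(m):
    """Probability a random member of the population is in round n,
    for n = 1..m, assuming M = 2**m - 1 (round n holds 2**(n-1) players)."""
    M = 2**m - 1
    return [2 ** (n - 1) / M for n in range(1, m + 1)]

# The memberships partition the whole population, so they sum to 1.
for m in (1, 5, 20):
    assert abs(sum(round_probabilities(m)) - 1.0) < 1e-12
```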
The horizontal axis of the Bayes Probability table now looks like this:
$$\begin{bmatrix} \frac{1}{M} & \frac{2}{M} & \frac{4}{M} & \frac{8}{M} & \dots & \frac{2^{m-1}}{M} \end{bmatrix}$$
For the vertical axis of the Bayes Probability table we can independently look at the odds the game ends at round n for $n \le m$. This can be due to snake eyes, or it can be due to reaching round m without ever rolling snake eyes. For the rounds $n = 1 \to m$ where snake eyes were rolled, the probability of the game ending on round n is $p(1-p)^{n-1}$, and the probability that a player reaches round m without ever rolling snake eyes is $(1-p)^m$. The sum of all of these possible end states in a game that has at most m rounds is $(1-p)^m + \sum_{n=1}^{m} p(1-p)^{n-1} = 1$.
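The end-state distribution can be checked numerically as well (again a sketch of mine; `end_state_probabilities` is a hypothetical helper, and $p = 1/36$ is the snake-eyes chance on two fair dice):

```python
def end_state_probabilities(m, p):
    """End-of-game distribution for a game of at most m rounds:
    snake eyes at round n with probability p*(1-p)**(n-1) for n = 1..m,
    or survival of all m rounds with probability (1-p)**m."""
    return [p * (1 - p) ** (n - 1) for n in range(1, m + 1)] + [(1 - p) ** m]

p = 1 / 36  # probability of snake eyes on a single roll of two dice
# The end states are exhaustive and mutually exclusive, so they sum to 1.
for m in (1, 4, 50):
    assert abs(sum(end_state_probabilities(m, p)) - 1.0) < 1e-12
```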
Moreover, with the full Bayes Probability table we can find other conditional probabilities at infinity by taking the limit as the population grows to allow a bigger and bigger maximum round m.
I think that’s what makes this a paradox.
So we have m+1 rows for the vertical axis:
$$\begin{bmatrix} p \\ p(1-p) \\ p(1-p)^2 \\ p(1-p)^3 \\ \vdots \\ p(1-p)^{m-1} \\ (1-p)^m \end{bmatrix}$$
So the Bayes Probability Table starts to look like this in general.
$$\begin{bmatrix}
\frac{p}{M} & \frac{2p}{M} & \frac{4p}{M} & \frac{8p}{M} & \dots & \frac{p\,2^{m-1}}{M} \\
\frac{p(1-p)}{M} & \frac{2p(1-p)}{M} & \frac{4p(1-p)}{M} & \frac{8p(1-p)}{M} & \dots & \frac{p(1-p)\,2^{m-1}}{M} \\
\frac{p(1-p)^2}{M} & \frac{2p(1-p)^2}{M} & \frac{4p(1-p)^2}{M} & \frac{8p(1-p)^2}{M} & \dots & \frac{p(1-p)^2\,2^{m-1}}{M} \\
\frac{p(1-p)^3}{M} & \frac{2p(1-p)^3}{M} & \frac{4p(1-p)^3}{M} & \frac{8p(1-p)^3}{M} & \dots & \frac{p(1-p)^3\,2^{m-1}}{M} \\
\vdots & \vdots & \vdots & \vdots & \ddots & \vdots \\
\frac{p(1-p)^{m-1}}{M} & \frac{2p(1-p)^{m-1}}{M} & \frac{4p(1-p)^{m-1}}{M} & \frac{8p(1-p)^{m-1}}{M} & \dots & \frac{p(1-p)^{m-1}\,2^{m-1}}{M} \\
\frac{(1-p)^m}{M} & \frac{2(1-p)^m}{M} & \frac{4(1-p)^m}{M} & \frac{8(1-p)^m}{M} & \dots & \frac{(1-p)^m\,2^{m-1}}{M}
\end{bmatrix}$$
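The table can be built cell by cell as the product of the two axes. Here is a short Python sketch of that construction (my own helper names, with $M = 2^m - 1$ as above); since both axes each sum to 1, every cell is a joint probability and the whole table sums to 1:

```python
def bayes_table(m, p):
    """(m+1) x m table: row i is an end state (snake eyes at round i for
    i = 1..m, then survival of all m rounds), column j is the round a player
    was placed in; each cell is row prob * column prob, with M = 2**m - 1."""
    M = 2**m - 1
    cols = [2 ** (j - 1) / M for j in range(1, m + 1)]
    rows = [p * (1 - p) ** (i - 1) for i in range(1, m + 1)] + [(1 - p) ** m]
    return [[r * c for c in cols] for r in rows]

p = 1 / 36
table = bayes_table(6, p)
assert abs(sum(sum(row) for row in table) - 1.0) < 1e-12
```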
The total probability of losing is equal to the sum of the diagonal where i=j
$$\frac{p}{M} + \frac{2p(1-p)}{M} + \frac{4p(1-p)^2}{M} + \frac{8p(1-p)^3}{M} + \dots + \frac{p(1-p)^{m-1}\,2^{m-1}}{M} = \frac{p}{M}\sum_{i=1}^{m}\left[(1-p)^{i-1}\,2^{i-1}\right]$$
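A quick numerical check of this diagonal sum (a sketch of mine, not from the original post):

```python
def p_lose(m, p):
    """Closed-form diagonal sum: (p/M) * sum_{i=1}^m (1-p)**(i-1) * 2**(i-1)."""
    M = 2**m - 1
    return (p / M) * sum(((1 - p) * 2) ** (i - 1) for i in range(1, m + 1))

def p_lose_cells(m, p):
    """Same quantity, summed cell by cell along the table's diagonal."""
    M = 2**m - 1
    return sum(p * (1 - p) ** (i - 1) * 2 ** (i - 1) / M for i in range(1, m + 1))

p = 1 / 36
for m in (1, 3, 10):
    assert abs(p_lose(m, p) - p_lose_cells(m, p)) < 1e-12
```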
The total probability of being chosen is the sum of the diagonal and all of the cells below the diagonal:
$$\sum_{i=1}^{m}\left[\frac{2^{i-1}(1-p)^m}{M}\right] + \sum_{i=1}^{m}\sum_{j=1}^{i}\left[\frac{2^{j-1}\,p(1-p)^{i-1}}{M}\right]$$
$$= \frac{(2^m - 1)(1-p)^m}{M} + \frac{p}{M}\sum_{i=1}^{m}\left[(2^i - 1)(1-p)^{i-1}\right]$$
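The collapse of the double sum into the closed form can also be verified numerically (again my own sketch; the inner sum $\sum_{j=1}^{i} 2^{j-1} = 2^i - 1$ is what produces the $(2^i - 1)$ factor):

```python
def p_chosen_double_sum(m, p):
    """Survival row plus the lower triangle, summed cell by cell."""
    M = 2**m - 1
    survival = sum(2 ** (i - 1) * (1 - p) ** m / M for i in range(1, m + 1))
    triangle = sum(2 ** (j - 1) * p * (1 - p) ** (i - 1) / M
                   for i in range(1, m + 1) for j in range(1, i + 1))
    return survival + triangle

def p_chosen_closed(m, p):
    """Closed form: (2**m - 1)(1-p)**m / M + (p/M) * sum (2**i - 1)(1-p)**(i-1)."""
    M = 2**m - 1
    return ((2**m - 1) * (1 - p) ** m / M
            + (p / M) * sum((2**i - 1) * (1 - p) ** (i - 1) for i in range(1, m + 1)))

p = 1 / 36
for m in (1, 2, 8):
    assert abs(p_chosen_double_sum(m, p) - p_chosen_closed(m, p)) < 1e-12
```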
So the conditional probability of losing given that you have been selected is
$$\frac{\frac{p}{M}\sum_{i=1}^{m}\left[(1-p)^{i-1}\,2^{i-1}\right]}{\frac{(2^m - 1)(1-p)^m}{M} + \frac{p}{M}\sum_{i=1}^{m}\left[(2^i - 1)(1-p)^{i-1}\right]}$$
$$= \frac{p\sum_{i=1}^{m}\left[(1-p)^{i-1}\,2^{i-1}\right]}{(2^m - 1)(1-p)^m + p\sum_{i=1}^{m}\left[(2^i - 1)(1-p)^{i-1}\right]} = p$$
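The key claim, that this ratio equals p for every finite m, checks out numerically (a sketch of mine; note the factor $\frac{1}{M}$ cancels, so the code omits it):

```python
def p_lose_given_chosen(m, p):
    """Diagonal sum over (diagonal + lower triangle + survival row),
    with the common 1/M factor cancelled out of numerator and denominator."""
    lose = p * sum(((1 - p) * 2) ** (i - 1) for i in range(1, m + 1))
    chosen = ((2**m - 1) * (1 - p) ** m
              + p * sum((2**i - 1) * (1 - p) ** (i - 1) for i in range(1, m + 1)))
    return lose / chosen

p = 1 / 36
# The conditional probability of losing given selection is exactly p for any m.
for m in (1, 2, 5, 30):
    assert abs(p_lose_given_chosen(m, p) - p) < 1e-12
```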
Taking the limit as the population grows to support a larger and larger maximum round m, the full Bayes Probability table also gives these conditional probabilities at infinity, such as the conditional probability of losing given that you have been selected from an infinite population, the conditional probability of being chosen in a game that never rolls snake eyes given that you were chosen at all, and the conditional probability of dying given that you were chosen in precisely round k.
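The last of these, the probability of dying given that you were chosen in precisely round k, can be read off the table's k-th column: the losing cell $(k, k)$ divided by the column cells with $i \ge k$ (the only end states in which round k is ever reached). A short sketch of mine, under that reading of the table (the common factor $\frac{2^{k-1}}{M}$ cancels):

```python
def p_die_given_round_k(m, k, p):
    """Cell (k, k) over the column-k cells with i >= k plus the survival row;
    the shared factor 2**(k-1)/M cancels out of the ratio."""
    column = sum(p * (1 - p) ** (i - 1) for i in range(k, m + 1)) + (1 - p) ** m
    return p * (1 - p) ** (k - 1) / column

p = 1 / 36
# This conditional probability also comes out to exactly p, for every k <= m.
for m, k in ((5, 1), (5, 3), (12, 12)):
    assert abs(p_die_given_round_k(m, k, p) - p) < 1e-12
```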
This seems like great work! If we're allowed to run out of players, the whole paradox collapses.