The odds of winning the lottery are ordinarily a billion to one. But now the branch in which you win has your “measure”, your “amount of experience”, temporarily multiplied by a trillion.
This may be stating the obvious, but how can branching multiply a probability by anything greater than 1? Conditional branches follow the rules of conjunctive probability: the probability of a sub-branch is the probability of its parent branch times the conditional probability of the branching event, so it can never exceed the parent's.
Probability with regard to the future is simply a matter of counting branches. The subset of branches in which you win is always only one in a billion of all branches, and any further events in a branch only create further sub-branches, so the probability of anything happening in such a sub-branch can never be greater than 10^-9. The exact number of copies is irrelevant here; it could be infinite and it wouldn't matter.
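A minimal sketch of the conjunction rule at work (the 0.5 conditional probability is an illustrative number, not from the original): whatever happens inside the winning branch, the resulting sub-branch's probability is the parent's 10^-9 times a factor no greater than 1.

```python
# Branch probabilities multiply down the tree (conjunction rule),
# so no sub-branch can ever be more probable than its parent.
p_win = 1e-9   # probability of the branch in which you win the lottery
p_event = 0.5  # illustrative conditional probability of some event inside it

p_win_and_event = p_win * p_event
assert p_win_and_event <= p_win  # sub-branch never exceeds parent
print(p_win_and_event)
```

Since every conditional probability lies between 0 and 1, multiplying down the tree can only shrink a branch's measure, never grow it.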
Whether we accept identification with only one copy of ourselves, as in jschculter's result, or we consider our 'self' to be all copies, the result still works out to a billion to one against winning.
Another way of looking at the matter: we should be wary of any non-objective decision process. If we substitute 'person X' for 'you' in the example, we wouldn't worry that person X splitting themselves into a trillion sub-copies only if they win the lottery would somehow increase their actual likelihood of winning.
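The copy-counting argument above can be sketched numerically (illustrative numbers, assuming the branch-counting view of measure from the original example): splitting the winning branch into a trillion copies only redistributes that branch's measure among the copies; it does not add any.

```python
import math

N_BRANCHES = 10**9  # one winning branch in a billion
COPIES = 10**12     # copies created only in the winning branch

# Each copy carries an equal share of the winning branch's measure.
measure_per_copy = (1 / N_BRANCHES) / COPIES

# Summed over all copies, the total winning measure is unchanged.
total_win_measure = measure_per_copy * COPIES

assert math.isclose(total_win_measure, 1e-9)
print(total_win_measure)  # still on the order of 10^-9
```

Dividing the winning branch's measure among more copies and then summing it back up is a no-op: the odds against winning remain a billion to one regardless of how many copies exist.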