I was an inveterate thirder until I read a series of articles on repeated betting, which pointed out that in many cases, maximizing expected utility leads to a “heavy-tailed” situation in which a few realizations of you have enormous utility, but most realizations of you have gone bankrupt. The mean utility across all realizations is large, but that’s useless in the vast majority of cases because there’s no way to transfer utility from one realization to another. This got me thinking about SB again, and the extent to which Beauties can or cannot share or transfer utility between them. I eventually convinced myself of the halfer position.
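A toy simulation makes the gap between the mean and the typical realization concrete. The payoff numbers here (a fair coin that multiplies wealth by 1.5 or 0.6) are my own, not from the articles:

```python
import random
import statistics

# Made-up numbers: each round, a fair coin multiplies your wealth by 1.5
# (heads) or 0.6 (tails).  The expected factor per round is
# 0.5*1.5 + 0.5*0.6 = 1.05 > 1, so mean wealth grows without bound, but
# E[log factor] = 0.5*(ln 1.5 + ln 0.6) < 0, so a typical realization decays.

def final_wealth(rounds: int = 100) -> float:
    wealth = 1.0
    for _ in range(rounds):
        wealth *= 1.5 if random.random() < 0.5 else 0.6
    return wealth

paths = [final_wealth() for _ in range(100_000)]
print(f"true mean wealth:   {1.05 ** 100:.1f}")              # ~131.5
print(f"sample mean wealth: {statistics.mean(paths):.3g}")   # noisy, outlier-driven
print(f"median wealth:      {statistics.median(paths):.3g}") # ~0.005: most go broke
```

The sample mean understates the true mean because the rare paths that dominate it are hardly ever sampled, which is exactly the point: the mean lives in realizations you will almost never be.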
Here’s the line of reasoning I used. If the coin comes up H, we have one awakening (experience A). If the coin comes up T, we have two awakenings, either in series or in parallel depending on the variant, but in any case indistinguishable. By Bayes, Pr(H|A) = Pr(A|H)Pr(H)/Pr(A). The core insight is that Pr(A|H) = Pr(A|T) = Pr(A) = 1, since you have experience A no matter how the coin lands; the likelihood and the evidence cancel, leaving Pr(H|A) = Pr(H) = 1/2. SB is akin to drawing a ball from one of two jars, one of which contains one red ball, and the other of which contains two red balls. Having drawn a red ball, you learn nothing about which jar you drew from.
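A quick enumeration of the jar analogy (exact arithmetic, nothing simulated) confirms that the posterior equals the prior:

```python
from fractions import Fraction

# Jar 1 holds one red ball, jar 2 holds two.  Pick a jar at random and draw
# a ball: the draw is red either way, so the likelihoods are equal and
# Bayes' rule returns the prior unchanged.
prior = {"jar1": Fraction(1, 2), "jar2": Fraction(1, 2)}
likelihood_red = {"jar1": Fraction(1), "jar2": Fraction(1)}

evidence = sum(prior[j] * likelihood_red[j] for j in prior)
posterior = {j: prior[j] * likelihood_red[j] / evidence for j in prior}
print(posterior)  # both still 1/2: drawing red told you nothing
```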
What about making bets, though? Say that SB is offered a chance to buy a ticket worth $1 if the coin was T, and $0 if it was H. To maintain indistinguishability between the “three Beauties,” she must be offered the same ticket each time she is awakened. In this case, SB should be willing to pay up to $2/3 for such a ticket. But this is not because the probability of T is really 2/3; it is because the payoff for T is doubled, since the bet is made twice in sequence. In the “clones” variant, SB’s valuation of the ticket depends on how she values the welfare of her clone-sister: if she is perfectly selfish she values it at $1/2, whereas if she is perfectly altruistic she values it at $2/3. Again, this is a fact about the payout structure, not the probability: obviously SB’s estimate of the probability of a coin flip cannot depend on whether she is selfish or not!
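Here is a sketch that recovers all three break-even prices from expected value with Pr(T) = 1/2 held fixed throughout; the case names and the linear-solve helper are mine:

```python
from fractions import Fraction

# Break-even prices for a ticket paying $1 if Tails, offered at every
# awakening.  The coin is fair in every case; only the payout accounting
# changes.
half = Fraction(1, 2)

def breakeven(net):
    # Each expected-net function is linear in the price p, so two
    # evaluations determine the root of net(p) = 0.
    n0, n1 = net(Fraction(0)), net(Fraction(1))
    return n0 / (n0 - n1)

# One Beauty, awakened twice on Tails, so she buys the ticket twice.
sequential = lambda p: half * 2 * (1 - p) + half * (-p)
# Clones, selfish: each Beauty counts only her own ticket.
selfish    = lambda p: half * (1 - p) + half * (-p)
# Clones, altruistic: each Beauty counts the pair's combined winnings.
altruistic = lambda p: half * (2 - 2 * p) + half * (-p)

for name, net in [("sequential", sequential), ("clones, selfish", selfish),
                  ("clones, altruistic", altruistic)]:
    print(f"{name}: break-even price ${breakeven(net)}")
# sequential: $2/3; clones, selfish: $1/2; clones, altruistic: $2/3
```

The 2/3 shows up whenever two copies of the payout land in the utility being maximized, never because the coin’s probability moved.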
A lot of anthropic arguments depend on simply “counting up the observers” and using that count as a proxy for probability. This is illegitimate, because conditional probabilities must always be normalized to sum to one: Pr(Monday|T) + Pr(Tuesday|T) = 1/2 + 1/2 = 1, so the two Tails awakenings split the Tails branch rather than each counting at full weight. Any time you use conditional probability you have to be very careful to keep the conditioning straight: Pr(Monday|T) = 1/2, whereas Pr(Monday and T) = Pr(Monday|T)Pr(T) = 1/4.
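Spelled out as a sketch of the halfer bookkeeping (the labels are mine):

```python
from fractions import Fraction

# The coin carries the probability; the two Tails awakenings split the
# Tails branch rather than each counting fully, as observer-counting would.
joint = {
    ("H", "Monday"):  Fraction(1, 2),                   # Pr(Monday|H) = 1
    ("T", "Monday"):  Fraction(1, 2) * Fraction(1, 2),  # Pr(Monday|T) = 1/2
    ("T", "Tuesday"): Fraction(1, 2) * Fraction(1, 2),  # Pr(Tuesday|T) = 1/2
}
assert sum(joint.values()) == 1                         # properly normalized

pr_T = sum(p for (coin, _), p in joint.items() if coin == "T")
print(f"Pr(T) = {pr_T}")                                # 1/2, not 2/3
print(f"Pr(Monday and T) = {joint[('T', 'Monday')]}")   # 1/4, not 1/2
```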