I was an inveterate thirder until I read a series of articles on repeated betting, which pointed out that in many cases, maximizing expected utility leads to a “heavy-tailed” situation in which a few realizations of you have enormous utility, but most realizations of you have gone bankrupt. The mean utility across all realizations is large, but that’s useless in the vast majority of cases because there’s no way to transfer utility from one realization to another. This got me thinking about SB again, and the extent to which Beauties can or cannot share or transfer utility between them. I eventually convinced myself of the halfer position.
Here’s the line of reasoning I used. If the coin comes up H, we have one awakening (experience A). If the coin comes up T, we have two awakenings—either in series or in parallel depending on the variant, but in any case indistinguishable. By Bayes, Pr(H|A) = Pr(A|H)Pr(H)/Pr(A). The core insight is that Pr(A|H) = Pr(A|T) = Pr(A) = 1, since you have experience A no matter what the coin flip says. SB is akin to drawing a ball from one of two jars, one of which contains one red ball, and the other of which contains two red balls. Having drawn a red ball, you learn nothing about which jar you drew from.
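The jar analogy is easy to check by direct enumeration. A minimal sketch (the jar labels H and T are just mnemonics for the two coin outcomes, not part of the original analogy):

```python
from fractions import Fraction

# Jar "H" holds one red ball; jar "T" holds two red balls.
jars = {"H": ["red"], "T": ["red", "red"]}

# Pick a jar at random, then a ball at random from that jar.
# Since every ball is red, conditioning on "drew red" changes nothing.
posterior_H = Fraction(0)
total_red = Fraction(0)
for jar, balls in jars.items():
    for ball in balls:
        w = Fraction(1, 2) * Fraction(1, len(balls))  # P(jar) * P(ball | jar)
        if ball == "red":
            total_red += w
            if jar == "H":
                posterior_H += w

print(posterior_H / total_red)  # 1/2
```

The posterior on the one-ball jar equals the prior, 1/2, which is the halfer point: drawing a red ball carries no information about which jar it came from.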
What about making bets, though? Say that SB is offered a chance to buy a ticket worth $1 if the coin was T, and $0 if it was H. To maintain indistinguishability between the “three Beauties,” each time she is awakened, she must be offered the same ticket. In this case, SB should be willing to pay up to $2/3 for such a ticket. But this is not because the probability of T is really 2/3; it is because the payoff for T is larger, since the bet is made twice in sequence. In the “clones” variant, SB’s valuation of the ticket depends on how she values the welfare of her clone-sister: if she is perfectly selfish she values it at $1/2, whereas if she is perfectly altruistic she values it at $2/3. Again, this is because of variations in the payout—obviously SB’s estimate of the probability of a coin flip cannot depend on whether she is selfish or not!
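Both breakeven prices fall out of one expected-value equation. A sketch (the function name and its parameter n are mine, not from the thread): the buyer pays p per ticket, and n counts how many Tails-awakening tickets accrue to the welfare she cares about.

```python
from fractions import Fraction

def ticket_ev(p, n):
    """Expected net value of buying at price p:
    Heads (prob 1/2): one purchase, pays nothing, net -p.
    Tails (prob 1/2): n counted purchases, each netting 1 - p."""
    return Fraction(1, 2) * (-p) + Fraction(1, 2) * n * (1 - p)

# Bet repeated on both Tails awakenings (or a perfectly altruistic clone): n = 2.
print(ticket_ev(Fraction(2, 3), 2))  # 0 -> $2/3 is the breakeven price

# A perfectly selfish clone counts only her own ticket: n = 1.
print(ticket_ev(Fraction(1, 2), 1))  # 0 -> $1/2 is the breakeven price
```

Only the payout structure n changes between the variants; the coin probability 1/2 is the same in both lines.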
A lot of anthropic arguments depend on simply “counting up the observers” and using that as a proxy for probability. This is illegitimate, because conditional probabilities must always be normalized to sum to one: Pr(Monday|T) + Pr(Tuesday|T) = 1/2 + 1/2 = 1. Any time you use conditional probability you have to be very careful: Pr(Monday|T) != Pr(Monday and T).
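A quick check of that distinction, with the numbers above:

```python
from fractions import Fraction

p_T = Fraction(1, 2)              # fair coin
p_monday_given_T = Fraction(1, 2)
p_tuesday_given_T = Fraction(1, 2)

# Conditional probabilities given T are normalized:
print(p_monday_given_T + p_tuesday_given_T)  # 1

# The joint probability is a different, smaller quantity:
p_monday_and_T = p_monday_given_T * p_T
print(p_monday_and_T)  # 1/4, not 1/2
```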
I try to avoid any discussion of repeated betting, because of the issues you raise. Doing so addresses the unorthodox part of an unorthodox problem, and so can be used to get either solution you prefer.
But that unorthodox part is unnecessary. In my comment to pathos_bot, I pointed out that there are significant differences between the problem as Elga posed it and the problem as it is used in the controversy. In the posed problem, the probability question is asked before you are put to sleep, and there is no Monday/Tuesday schedule. In his solution, Elga never asked the question upon waking, and he used the Monday/Tuesday schedule to implement the problem but inadvertently created the unorthodox part.
There is a better implementation that avoids the unorthodox part.
Before being put to sleep, you are told that two coins, C1 and C2, will be flipped after you are put to sleep, and that at any moment during the experiment we want to know the degree to which you believe that coin C1 came up Heads. Then, if either coin is showing Tails (but not if both are showing Heads):
1. You will be wakened.
2. Remember what we wanted to know? Tell us your degree of belief.
3. You will be put back to sleep with amnesia.
Once this is either skipped or completed, coin C2 is turned over to show its other side, and the process is repeated one more time.
This implements Elga’s problem exactly, and adds less to it than he did. But now you can consider just what has happened between looking at the coins to see if either is showing Tails, and now. When examined, there were four equiprobable combinations of the two coins: HH, HT, TH, and TT. Since you are awake, HH is eliminated. Of the three combinations that remain, C1 landed on Heads in only one.
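That counting argument can be sanity-checked by enumerating the protocol: weight the four coin combinations equally, run both passes, and tally every awakening. (A sketch; it computes the fraction of awakenings at which C1 shows Heads, which is the quantity the elimination argument tracks.)

```python
from fractions import Fraction
from itertools import product

flip = {"H": "T", "T": "H"}
heads_awakenings = Fraction(0)
total_awakenings = Fraction(0)

for c1, c2 in product("HT", repeat=2):   # four equiprobable flips of C1, C2
    for showing in (c2, flip[c2]):       # pass 1, then C2 turned over for pass 2
        if (c1, showing) != ("H", "H"):  # wakened unless both coins show Heads
            total_awakenings += Fraction(1, 4)
            if c1 == "H":
                heads_awakenings += Fraction(1, 4)

print(heads_awakenings / total_awakenings)  # 1/3
```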
Were you always a thirder? Or is this two coin version of Sleeping Beauty what changed your mind to become one? Would you change your mind if the two coin case was found to be flawed?
I skipped answering the initial question because I’ve always been a thirder. I’m just trying to comment on the reasons people have given, mostly on how many will try to use fuzzy logic, like “isn’t the question just asking about the coin flip?”, in order to make the answer they find intuitive sound more reasonable. I find that people tend either to keep their answer because they don’t want to back down from their intuition, or to oscillate back and forth without recalling why they picked an answer a few weeks later. Many of those end up with “it depends on what you think the question is.”
Suppose we have the same two-coin setting, but instead of steps 1, 2, and 3, a ball is put into a box.
Then, after the procedure is done and there are either one or two balls in the box, you are given random balls from it as long as there are any. You’ve just gotten a random ball. Should you, by the same logic, assume that the probability of getting a second ball is 2/3?
You’ll need to describe that better. If you replace (implied by “instead”) step 1, you are never wakened. If you add “2.1. Put a ball into the box” and “2.2. Remove balls from the box, one by one, until there are no more,” then there are never two balls in the box.
I mean that there is no sleeping or awakening; instead, balls are put in a box following the same logic:
Two coins are tossed; if both are Heads, nothing happens, otherwise a ball is put into a box. Then the second coin is turned over to its other side and, once again, a ball is put into the box unless both coins are showing Heads. Then you are randomly given a ball from the box.
Should you reason that there is another ball in the box with probability 2/3? After all, there are four equiprobable combinations: HH, TT, HT, TH. Since the ball you were given was put into the box, that couldn’t have happened when the coins showed HH, so we are left with HT, TH, and TT.
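Enumerating the ball protocol makes the two candidate answers explicit (a sketch, not a verdict on the dispute: “per run” conditions only on having received some ball, while “per ball” treats every ball ever handed out as equally likely):

```python
from fractions import Fraction
from itertools import product

flip = {"H": "T", "T": "H"}
runs = []  # (probability of this coin combo, balls ending up in the box)
for c1, c2 in product("HT", repeat=2):
    balls = sum(1 for showing in (c2, flip[c2]) if (c1, showing) != ("H", "H"))
    runs.append((Fraction(1, 4), balls))

# Every run puts at least one ball in the box, so receiving a ball is certain.
p_second_per_run = sum(p for p, b in runs if b == 2)
print(p_second_per_run)   # 1/2

# Weighting each run by the number of balls it hands out instead:
total_balls = sum(p * b for p, b in runs)
p_second_per_ball = sum(p * b for p, b in runs if b == 2) / total_balls
print(p_second_per_ball)  # 2/3
```

The gap between the two numbers, 1/2 versus 2/3, is exactly the per-experiment versus per-observation weighting at issue in the thread.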
This variation of my two-coin version just converts my version of the problem Elga posed back into the one Elga solved. And if you leave out the amnesia step (you didn’t say), it does so incorrectly.
The entire point of the two-coin version was that it eliminated the obfuscating details that Elga added. So why put them back?
So please, before I address this attempt at diversion in more detail, address mine.
Do you think my version accurately implements the problem as posed?
Do you think my solution, yielding the unambiguous answer 1⁄3, is correct? If not, why not?
Your Two Coin Toss version is isomorphic to the classical Sleeping Beauty problem, with everything this entails.
The problem Elga solved in his paper isn’t actually the Sleeping Beauty problem; more on that in my next post.
Likewise, the solution you propose to your Two Coin Toss problem is actually solving a different problem:
Two coins are tossed; if the outcome is HH you are not awakened, and on every other outcome you are awakened. You are awakened. What is the probability that the first coin came up Heads?
Here your reasoning is correct. There are four equiprobable possible outcomes, and awakening eliminates one of them. A person who participates in the experiment couldn’t be certain to experience an awakening, and that’s why it is evidence in favor of Tails. 1/3 is the unambiguously correct answer.
But in the Two Coin Toss version of Sleeping Beauty this logic doesn’t apply; it would prove too much. To see why, you may investigate my example with balls being put in a box instead of awakenings and memory erasure.
My problem setup is an exact implementation of the problem Elga asked. Elga’s version adds some detail that does not affect the answer, but it has created more than two decades of controversy.
So, could you answer the initial question?
The answer of 1⁄3.