If the procedure is to flip a coin, clone me on tails but not on heads, separate the original and the clone if needed, and then let me (or us) wake up, then when I wake up I will think the probability that I am the original is 3⁄4 and the probability that the coin landed on heads is 1⁄2. If I am then informed that I am the original, I will think the probability that the coin landed on heads is 2⁄3.
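Since the thread never pins down what these numbers operationally mean, here is a minimal Monte Carlo sketch of one reading, assuming "which person am I?" is a uniform draw over whoever wakes up in a given run (that sampling convention is my addition, not something the commenter specified):

```python
import random

def run_trial(rng):
    """One run of the quoted protocol: flip, clone on tails, then sample
    one awakened person to play the role of 'me'."""
    heads = rng.random() < 0.5
    people = ["original"] if heads else ["original", "clone"]
    # Assumption: subjective identity is a uniform draw over the people who
    # actually wake up in this run -- the weighting the quoted 3/4 and 2/3
    # correspond to.  The thread itself doesn't fix this convention.
    return heads, rng.choice(people)

rng = random.Random(0)
trials = [run_trial(rng) for _ in range(100_000)]

p_original = sum(who == "original" for _, who in trials) / len(trials)
as_original = [heads for heads, who in trials if who == "original"]
p_heads_given_original = sum(as_original) / len(as_original)

print(f"P(I am the original)         ~ {p_original:.3f}")             # ~0.75
print(f"P(heads | I am the original) ~ {p_heads_given_original:.3f}") # ~0.667
```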
But that can’t be right. For this experiment, the Original and the Clone do not have to be woken up at the same time. The mad scientist could wake up the Original first; in fact, the coin could be tossed after that. For dramatic effect, after telling you that you are the Original, the mad scientist could hand you the coin and let you toss it yourself. It seems absurd to say the probability of Heads is anything but 1⁄2. Why does the probability of Heads remain unchanged after learning you are the Original?
I don’t understand the objection. Yes, I say 1⁄2, and yes, anything but that seems absurd. This coinflip isn’t correlated with whether I was cloned, why should its probability depend on whether I am the original or the clone? In the first situation I believe the pre-experiment coinflip has 50% probability of having landed heads, then after learning some information positively correlated with the actual result being heads, I update to 67% probability of the coin having landed heads. In the dramatic situation I believe the post-experiment coinflip has 50% chance of heads and never learn any information correlated with the result. Zero contradiction here.
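A sketch making the "two different coinflips" point concrete, under the same uniform-sampling assumption as in the sketch above (again my convention, not something established in the thread):

```python
import random

rng = random.Random(1)
N = 100_000

# Pre-experiment flip (first situation): coin first, clone on tails, then
# I learn "you are the original".  Sampling convention as in the sketch above.
pre = []
for _ in range(N):
    heads = rng.random() < 0.5
    people = ["original"] if heads else ["original", "clone"]
    if rng.choice(people) == "original":
        pre.append(heads)
print(f"pre-toss flip:  P(heads | told 'original') ~ {sum(pre) / len(pre):.3f}")   # ~0.667

# Post-experiment flip (dramatic situation): I am told I'm the original and
# *then* toss the coin, so the result carries no information about me.
post = [rng.random() < 0.5 for _ in range(N)]
print(f"post-toss flip: P(heads)                   ~ {sum(post) / len(post):.3f}") # ~0.5
```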
(That also means, of course, that when I say I choose 3⁄4 and 1⁄2 and then 2⁄3, I am smuggling in information: implicitly assuming a reward structure for “getting the right answer”. If I only care about being right when I’m the original and don’t care at all when I’m the clone, then I can answer “probability 1.0 that I’m the original!” and make out like a bandit. In that sense, yes, all probabilities are meaningless, not just self-locating ones, until you know what decisions are being made based on them.)
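To make the bandit point concrete, here is a toy scoring comparison (the quadratic penalty and the two particular payoff schemes are my illustration, not anything specified upthread):

```python
# Quadratic (Brier-style) penalty for announcing probability p of "I am
# the original", under two payoff schemes.  The world weights (Heads 1/2,
# and within Tails each copy equally likely to be scored) are my assumption.

def sampled_awakening_loss(p):
    # One awakened copy is picked at random and scored: heads -> original
    # (truth 1); tails -> original or clone, 50/50 (truths 1 and 0).
    return 0.5 * (1 - p) ** 2 + 0.5 * (0.5 * (1 - p) ** 2 + 0.5 * p ** 2)

def original_only_loss(p):
    # Only the original's answer ever gets scored; the clone's is discarded.
    return (1 - p) ** 2

candidates = [i / 1000 for i in range(1001)]
print(min(candidates, key=sampled_awakening_loss))  # 0.75: the 3/4 answer
print(min(candidates, key=original_only_loss))      # 1.0: "certain I'm the original!"
```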
The experiment can take the following steps: 1. Sleep. 2. Scan. 3. Wake up the Original. 4. The Original tosses the coin. 5. If Tails, create the Clone. 6. Wake up the Clone so that his experience is indiscernible from the Original’s in step 3. This whole process is disclosed to you.
Now, after waking up, the coin may or may not have been tossed yet. What is the probability of Heads? And what is the probability that I am the Original (i.e. that the coin has yet to be tossed)?
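For what it’s worth, the same kind of simulation can be run on this protocol; under the same uniform-sampling convention as in the sketches above (my assumption, and exactly the convention this question is probing), the frequencies come out as:

```python
import random

# Sketch of the flip-after-waking protocol (steps 1-6 above), assuming "my"
# awakening is a uniform draw over the awakenings that share the
# indiscernible experience: the Original at step 3, the Clone at step 6.
# That convention is mine; the question deliberately leaves it open.
rng = random.Random(2)
N = 100_000
samples = []
for _ in range(N):
    heads = rng.random() < 0.5  # step 4: the Original tosses the coin
    awakenings = ["original"] if heads else ["original", "clone"]  # steps 5-6
    samples.append((heads, rng.choice(awakenings)))

print(f"P(I am the Original) ~ {sum(w == 'original' for _, w in samples) / N:.3f}")  # ~0.75
print(f"P(Heads)             ~ {sum(h for h, _ in samples) / N:.3f}")                # ~0.5
```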
Probability is the measure of your uncertainty.
I think the comments on https://www.lesswrong.com/posts/YyJ8roBHhD3BhdXWn/probability-is-a-model-frequency-is-an-observation-why-both are pretty good, btw. They really showcase how all the hand-waving goes away as soon as you specify the decisions the original/clone will be making based on their degree of belief that they’re the original.