Shouldn’t you anticipate being either clone with 100% probability, since both clones will make that claim and neither can be considered wrong?
What I meant is that some time after the cloning, the clones’ lives would become distinguishable. One of them would experience X, while the other would experience ~X. Then I would anticipate experiencing X with 50% probability.
If they live identical lives forever, then I can anticipate “being either clone”, or, as I would call it, “not being able to tell which clone I am”.
My first instinctive response is “be wary of theories of personal identity where your future depends on a coin flip”. You’re essentially saying “one of the clones believes that it is your current ‘I’ experiencing ‘X’, and it has a 50% chance of being wrong”. That seems off.
I think to be consistent, you have to anticipate experiencing both X and ~X with 100% probability. The problem is that the way anticipation works with probability depends implicitly on there only being one future self that things can happen to.
No, I’m not saying that.
I’m saying: at first, both clones anticipate X with 50% probability. Then one clone experiences X, and the other ~X. Once each knows what he experienced, of course one updates to “I experienced X” with ~1 probability and the other to “I experienced ~X” with ~1 probability.
I think we need to unpack “experiencing” here.
I anticipate, with 50% probability, that there will be a future state of me which has experienced X (i.e., remembers experiencing X).
If X takes a nontrivial amount of time, so that one can experience “X is going on now”, then I likewise anticipate ever having that experience with 50% probability.
But that means there is always (100%) a future state of you that has experienced X, and always (100%) a separate future state that has experienced ~X. I think there’s some similarity here to the problem of probability in a many-worlds universe, except that in this case both versions can still interact. I’m not sure myself how that affects things.
You’re right, there’s a contradiction in what I said. Here’s how to resolve it.
At time T=1 there is one of me, and I go to sleep.
While I sleep, a clone of me is made and placed in an identical room.
At T=2 both clones wake up.
At T=3 one clone experiences X. The other doesn’t (and knows that he doesn’t).
So, what probability should I assign to experiencing X?
At T=3 I know for sure, so it goes to 1 for one clone and 0 for the other.
At T=2, the clones have woken up, but neither knows yet which one he is. Therefore each expects X with 50% probability.
At T=1, before going to sleep, there isn’t a single number that is the correct expectation. This isn’t because probability breaks down, but because the concept of “my future experience” breaks down in the presence of clones. Neither 50% nor 100% is right.
50% is wrong for the reason you point out. 100% is also wrong, because X and ~X are symmetrical. Assigning 100% to X means 0% to ~X.
So in the presence of expected future clones, we shouldn’t speak of “what I expect to experience” but “what I expect a clone of mine to experience”—or “all clones”, or “p proportion of clones”.
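For concreteness, here is a minimal simulation sketch of that bookkeeping (purely illustrative; the code, its variable names, and the trial count are mine, not anything proposed in the discussion). It tallies two well-posed questions separately, and the point is that neither tally is “the probability that I experience X”:

    import random

    # Illustrative only: simulate the T=1..T=3 story many times and keep
    # separate tallies for two different questions.
    TRIALS = 100_000

    clones_seen = 0          # clone-instances awake at T=2
    clones_that_get_x = 0    # of those, how many go on to experience X at T=3
    runs_with_some_x = 0     # runs in which at least one clone experiences X

    for _ in range(TRIALS):
        # T=2: two clones wake up in identical rooms; neither knows which he is.
        # T=3: exactly one of them (picked arbitrarily here) experiences X.
        x_clone = random.randrange(2)
        for clone in (0, 1):
            clones_seen += 1
            if clone == x_clone:
                clones_that_get_x += 1
        runs_with_some_x += 1    # by construction, some clone always gets X

    print(clones_that_get_x / clones_seen)   # 0.5: proportion of clones that experience X
    print(runs_with_some_x / TRIALS)         # 1.0: proportion of runs in which some clone experiences X

The first number corresponds to “what I expect a p proportion of clones to experience”; the second corresponds to “there is always (100%) a future state that has experienced X”.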
Suppose I’m ~100% confident that, while we sleep tonight, someone will paint a blue dot on either my forehead or my husband’s, but not both. In that case: I am ~50% confident that I will see a blue dot; I am ~100% confident that one of us will see a blue dot; and I am ~100% confident that one of us will not see a blue dot.
If someone said that seeing a blue dot and not-seeing a blue dot are symmetrical, so assigning ~100% confidence to “one of us will see a blue dot” means assigning ~0% to “one of us will not see a blue dot”, I would reply that they are deeply confused. The noun phrase “one of us” simply doesn’t behave that way.
In the scenario you describe, the noun phrase “I” doesn’t behave that way either.
I’m ~100% confident that I will experience X, and I’m ~100% confident that I will not experience X.
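The blue-dot numbers can be checked by brute enumeration over the two possible worlds. A minimal sketch (mine, purely illustrative), assuming, as the ~50% figure implies, that either forehead is equally likely:

    from fractions import Fraction

    # Illustrative only: the two (assumed equally likely) ways the night can go.
    worlds = [
        {"me": True,  "husband": False},   # dot painted on my forehead
        {"me": False, "husband": True},    # dot painted on my husband's forehead
    ]
    p = Fraction(1, 2)                     # assumed probability of each world

    p_i_see_dot        = sum(p for w in worlds if w["me"])
    p_one_of_us_sees   = sum(p for w in worlds if w["me"] or w["husband"])
    p_one_of_us_doesnt = sum(p for w in worlds if not w["me"] or not w["husband"])

    print(p_i_see_dot)         # 1/2 : "I will see a blue dot"
    print(p_one_of_us_sees)    # 1   : "one of us will see a blue dot"
    print(p_one_of_us_doesnt)  # 1   : "one of us will not see a blue dot"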
I really find that subscripts help here.
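One way the subscripts might be spelled out (my own rendering of the suggestion, using the T=1/T=2/T=3 story above, writing I_1 for the clone that goes on to experience X and I_2 for the other):

    P(I_1 experiences X) = 1
    P(I_2 experiences X) = 0
    P(I am I_1 | what a clone knows at T=2) = 1/2

The disagreement in the thread is then only about how, if at all, an unsubscripted “P(I experience X)” should be read.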
In your example, you anticipate your own experiences, but not your husband’s experiences. I don’t see how this is analogous to a case of cloning, where you equally anticipate both.
I’m not saying that “[exactly] one of us will see a blue dot” and “[neither] one of us will not see a blue dot” are symmetrical; that would be wrong. What I was saying was that “I will see a blue dot” and “I will not see a blue dot” are symmetrical.
All the terminologies that have been proposed here—by me, and you, and FeepingCreature—are just disagreeing over names, not real-world predictions.
I think the statement “I’m ~100% confident that I will experience X, and I’m ~100% confident that I will not experience X” is at the very least misleading, because it’s semantically different from other grammatically similar constructions. Normally you can’t say “I am ~1 confident that [Y] and also ~1 confident that [~Y]”. So “I” isn’t behaving like an ordinary object. That’s why I think it’s better to be explicit and not talk about “I expect” at all in the presence of clones.
My comment about “symmetrical” was intended to mean the same thing: that when I read the statement “expect X with 100% probability”, I normally parse it as equivalent to “expect ~X with 0% probability”, which would be wrong here. And X and ~X are symmetrical by construction in the sense that every person, at every point in time, should expect X and ~X with the same probability (whether you call it “both 50%” like I do, or “both 100%” like FeepingCreature prefers), until of course a person actually observes either X or ~X.
In my example, my husband and I are two people, anticipating the experience of two people. In your example, I am one person, anticipating the experience of two people. It seems to me that what my husband and I anticipate in my example is analogous to what I anticipate in your example.
But, regardless, I agree that we’re just disagreeing about names, and if you prefer the approach of not talking about “I expect” in such cases, that’s OK with me.