I will bite the first horn of the trilemma. I will argue that the increase in subjective probability results from losing information, and that it is no different from other situations where you lose information in such a way as to make subjective probabilities seem higher. For example, if you watch the lotto draw but then forget every number except those that match your ticket, your subjective probability that you won will be much higher than it originally was.
Let’s imagine that if you win the lottery, a billion copies of you will be created.
t=0: The lottery is drawn
t=1: If you won the lottery, then a billion clones are created. The original knows they are the original, since they see the clones being created; the clones, if any were made, don’t know they are clones, and don’t know that the original witnessed the cloning, so they can’t figure it out that way.
t=2: You have a bad memory, so you forget whether you are the original or a clone.
t=3: If any clones exist, they are all killed off.
t=4: Everyone is informed about whether or not they won the lottery.
Let’s suppose that you know you are the original and that you are at t=1. Your chances of winning the lottery are still one in a million, as the creation of clones does not affect your probability of waking up to a win at t=4 when you know you are not a clone.
Now let’s consider the probability at t=2. Your subjective odds of winning the lottery have risen massively, since you are most probably a copy. Even though there is only a one in a million chance that copies will be made, the fact that a billion copies would be made more than cancels this out.
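The arithmetic behind this can be sketched as follows. This is my own illustration, assuming the standard move of weighting each branch by its probability times the number of observers in it whose evidence matches yours:

```python
# Subjective probability of having won, at t=2, for someone who has
# forgotten whether they are the original or a clone.

P_WIN = 1e-6               # one in a million chance of winning
N_CLONES = 1_000_000_000   # clones created only in the win branch

# Observers whose evidence matches yours in each branch:
# win branch: the original plus a billion clones; lose branch: just you.
win_observers = 1 + N_CLONES
lose_observers = 1

# Weight each branch by probability x matching observers, then normalise.
posterior_win = (P_WIN * win_observers) / (
    P_WIN * win_observers + (1 - P_WIN) * lose_observers
)
print(posterior_win)  # ~0.999: you are almost certainly a copy of a winner
```

The billion-to-one observer ratio overwhelms the million-to-one prior, which is exactly the "more than cancels this out" claim above.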
What we have identified is that the information loss is the key feature. Of course you can always increase a subjective probability by erasing contrary information. What is interesting about cloning is that if we can create clones with the exact same information, we can effectively remove knowledge without touching your brain. That is, even if you know that you are not a clone, after we have cloned you exactly you no longer know that you are not a clone, unless someone tells you or you see it happen.
Now at t=3 we kill off/merge any remaining clones. If you are still alive, you’ve gained information when you learned that you weren’t killed off. In fact, you’ve been retaught the same information you’d forgotten.
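Continuing my sketch from above (same assumed numbers), a Bayesian update on "I survived the cull" exactly undoes the t=2 shift, which is why survival amounts to being retaught the forgotten fact:

```python
# Surviving the cull at t=3 restores the original one-in-a-million odds.

P_WIN = 1e-6
N_CLONES = 1_000_000_000

# Subjective probability of a win at t=2 (as derived earlier).
p_win_t2 = (P_WIN * (1 + N_CLONES)) / (
    P_WIN * (1 + N_CLONES) + (1 - P_WIN)
)

# P(survive | win) = chance you were the original among 1 + N_CLONES
# equally likely candidates; P(survive | lose) = 1 (no clones to kill).
p_survive_given_win = 1 / (1 + N_CLONES)
p_survive_given_lose = 1.0

# Bayes' rule: conditioning on survival cancels the observer count.
p_win_t3 = (p_win_t2 * p_survive_given_win) / (
    p_win_t2 * p_survive_given_win
    + (1 - p_win_t2) * p_survive_given_lose
)
print(p_win_t3)  # back to ~1e-6
```

The factor of 1 + N_CLONES that inflated the t=2 posterior is precisely the factor by which survival is more surprising in the win branch, so the two cancel.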
So, I (as casebash) originally wanted to bite the first horn of the trilemma, but then I wanted to bite different horns depending on the assumptions.
Suppose we embrace physical continuity. Then all we’ve done by creating copies is manage the news. The original is just as likely to be in any given state, but you no longer know whether you are the original or a clone. But even putting this aside, the ability to merge copies seems to contradict the idea of additional copies being given extra weight. If we embrace this idea, it seems more accurate to say that we can only add or delete copies.
On the other hand, suppose we reject it. What would psychological continuity mean? Imagine a physical agent A and a clone C which undergo identical experiences over 10 seconds. Now we can use A and C as our psychologically continuous agents, but there’s no reason why we couldn’t instead construct a psychologically continuous agent that is A during the even-numbered seconds and C during the odd-numbered seconds. In fact, there are a ridiculous number of overlapping, psychologically continuous agents, all of which ought to be equally valid. This seems absurd, so I’d go so far as to suggest that if we reject physical continuity, we ought to reject the notion of continuity altogether. This would lead us to take the third horn.