Why not?
There’s no mechanism linking the two entities, so it seems necessary that each entity has a distinct first-person experience. Whoever “you” are, then, you can’t experience being both entities. I think that’s the cleanest way to express what I mean, and thank you for calling me on using “obvious.”
Another way of thinking about this: Suppose someone offers to make 1 million essentially perfect copies of you and give them the best life they can engineer, one you get to confirm fits perfectly with your own values. The catch: prior to the copying, they’ll paint a 1 on your forehead, which will not be copied. They’ll then find “you” and subject the “original” to endless torture. I, for one, would not hesitate to reject this offer for largely self-interested reasons. I can understand an altruist taking it, though that makes the fact that the million people are copies rather irrelevant. If I understand the stance of many people here (RH, for example), they’d take the deal out of self-interest (given some sufficient number of copies, which might need to be greater than a million), because they don’t distinguish between copies. This seems like severely flawed reasoning, though too complex to properly address in a sub-sub-comment. I’d like to know if this is a straw man.
In general, I find that continuity of consciousness is an illusion hard-wired into us for self-preservation. We can explain the mind without needing to define some entity that remains the same from our birth to our death, and any attempted definition of such an entity gets more and more convoluted as you try to consistently answer questions like “if you lose all your memories, is it still you”, “if you get disassembled and then rebuilt, is that still you”, and “how can you at 5 be the same person as you at 50”. It’s a bit like believing in a soul.
Still, the concept of a ‘you’ has various uses, e.g. legal and social ones, and it remains a relatively well-defined concept as long as you don’t consider the weirder edge cases. Once we have a world where people can be copied, however, the folk-psychological concept of “you” pretty much becomes incoherent and arbitrary. That still doesn’t force you to completely abandon the concept, of course; you can define it however you wish.
As for your thought experiment, there are at least two interpretations that make sense to me. One is that since every copy will have experiences and memories identical to being me and there are a million of them, there’s a 1⁄1,000,000 chance for me to “become” any particular one of the copies. Correspondingly, there’s a 1⁄1,000,000 chance that I’ll be tortured. The other interpretation is that there is a 100% chance that I will “become” each of the copies, so a 100% chance that I’ll become the one that is eternally tortured and a 100% chance that I’ll also become each of the 999,999 others.
Alternatively, you could also say that there’s a 100% chance that I’ll remain the one who had “1” painted on his forehead. Or that I’ll become all of the copies whose number happens to be a prime. Or whatever. Identity is arbitrary in such a scenario, so which one is “correct” depends pretty much only on your taste.
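For concreteness, here’s a minimal numerical sketch (mine, not part of the original argument) of how the two interpretations above cash out if you’re willing to attach utility numbers to the outcomes. The utility values, and the bookkeeping choice of counting the marked original as one of 1,000,001 successor-instances, are placeholder assumptions purely for illustration:

```python
# Toy sketch: how the two interpretations of "which successor do I become?"
# cash out numerically. All utility numbers below are made-up placeholders.

N_COPIES = 1_000_000        # copies living the engineered best life
U_GOOD = 1.0                # placeholder utility of the engineered best life
U_TORTURE = -1_000_000.0    # placeholder utility of endless torture

# Interpretation 1: a uniform lottery over successor-instances.
# One instance (the marked original) is tortured; the rest live the good life.
# Counting the original gives 1/1,000,001 rather than exactly 1/1,000,000,
# which is a bookkeeping choice and doesn't change the order of magnitude.
total_instances = N_COPIES + 1
p_torture = 1 / total_instances
ev_lottery = p_torture * U_TORTURE + (1 - p_torture) * U_GOOD

# Interpretation 2: "I become all of them", so utilities are summed over
# successor-instances rather than averaged.
ev_sum = N_COPIES * U_GOOD + U_TORTURE

print(f"P(torture) under the uniform-lottery reading: {p_torture:.7f}")
print(f"Expected utility under the lottery reading:   {ev_lottery:,.2f}")
print(f"Aggregate utility under the 'become all' reading: {ev_sum:,.2f}")
```

Nothing here settles which reading is correct; it only makes explicit that the two readings can recommend opposite answers to the offer, depending entirely on the placeholder numbers you plug in.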