Every Implementation of You is You: An Intuition Ladder
I was recently arguing in /r/transhumanism on Reddit about the viability of uploading or forking consciousness, and I realized I had no way of assessing where someone’s beliefs actually lie, and therefore where I would need to move them from if I wanted to convince them of what I thought.
So I made an intuition ladder. Please correct me if I made any mistakes (that aren’t by design), and let me know if you think there’s anything past the final level.
Some instructions on how to use this: Read the first level. If you notice something definitely wrong with it, move to the next level. Repeat until you come to a level where your intuition about the entire level is either “This is true” or “I’m not sure.” That is your level.
1. Clones and copies (a copy being the result of a medical procedure that physically reproduces you exactly, including internal brain state) are the same thing. Every intuition I have about a clone or an identical twin applies one-to-one to copies as well, and vice versa. Because identical twins are completely different people on every level except the genetic one, copies are exactly the same way.
2. Clones and copies aren’t the same thing, because a copy shared a brain and memories with me in the past. Still, for one of us those memories are false: that one is just a copy, and my consciousness would remain with the privileged original.
3. Copies had a brain and memories in common, which makes them indistinguishable from each other in principle. They believe they’re me, and they’re not wrong in any meaningful sense, but I don’t anticipate waking up from any copying procedure in any body but the one I started in. As such, I would never participate in a procedure that claims to “teleport” me by making a copy at a new location and killing the source copy, because I would die.
4. Copies are indistinguishable from each other in principle, even from the inside, and thus I actually become both, and anticipate waking up as either. But once I am one or the other, my copy doesn’t share an identity with me. Furthermore, if a copy is destroyed before I wake up from the procedure, I might die, or I might wake up as the copy that is still alive. As such, the fork-and-die teleport is a gamble for my life, and I would only attempt it if I were for some reason comfortable with the chance that I would die.
5. If a copy is destroyed during the procedure, I will wake up as the other one with near certainty, but this is a discrete consequence of how soon the destruction happens. If one copy were to die shortly after the procedure instead, I wouldn’t be any less likely to wake up as that one. I am therefore willing to fork-and-die teleport as long as the procedure is flawless. Furthermore, if I were instead backed up and copied from the backup at a later date, I would certainly wake up immediately after the procedure, and would not anticipate waking up subjectively-immediately as the backup copy in the future.
6. My anticipation of which copy I wake up as is a continuous function of each copy’s weight: I anticipate waking up as a copy that will die soon after the procedure, or that for some other reason has a lower amplitude according to the Born rule, with correspondingly lower probability (a toy sketch of this appears after the ladder). It is also entirely irrelevant to my anticipation when a copy is instantiated, as long as it has the mind state I had when the procedure was done. However, consciousness can only transfer to copies made of me. I could never wake up as an identical mind state elsewhere in the universe that wasn’t the result of copying, if such a thing were to exist, even in principle.
7. Continuity of consciousness is entirely an artifact of mind state, including memory, and need not require adjacency in spacetime at all. If, by some miraculous coincidence, a person in a galaxy far, far away exists at some time t’ who is exactly identical to me at some time t in my life, in just the way a copy made of me at t would be, then at the moment t I anticipate my consciousness transferring to that far-away not-copy with some probability. The only reason this doesn’t happen is the sheer unlikelihood of an exact mind state, memories and all, being duplicated by happenstance anywhere in spacetime, even given the age of the universe from beginning to end. However, my consciousness can only be implemented on a human brain, or something that precisely mimics its internal structure.
8. Copies of me need not be, or even resemble, a human being. I am just an algorithm, and the hardware I am implemented on is irrelevant: whether it runs on a microchip or a human brain, any implementation of me is me. However, simulations aren’t truly real, so an implementation of me in a simulated world, no matter how advanced, isn’t actually me, or conscious to the extent that I am in the reality I know.
9. Implementations of me can exist within simulations that are sufficiently advanced to implement me fully. If a superintelligence able to perfectly model human minds uses that ability to consider what I would do, its model of me is me. Indeed, the only way to model me perfectly is to implement me (see the second sketch after the ladder).
10. In progress, see Dacyn’s comment below.
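To make the anticipation talk in levels 4-6 concrete, here is a minimal toy sketch in Python. It assumes, purely for illustration, that each copy can be assigned a non-negative “measure” (a Born-rule weight, or zero for a copy destroyed before it ever wakes up) and that the probability of waking up as a given copy is its measure normalized over all copies; the Copy class, the helper function, and the numbers are my own hypothetical constructions, not anything taken from the ladder itself.

```python
# Toy model of levels 4-6: each copy gets a non-negative "measure"
# (e.g. a Born-rule weight, or 0 for a copy destroyed before it ever
# wakes up), and the anticipated probability of waking up as a given
# copy is its measure normalized over all copies. Purely illustrative.

from dataclasses import dataclass


@dataclass
class Copy:
    label: str
    measure: float  # relative weight of this copy's branch or instantiation


def anticipation_probabilities(copies: list[Copy]) -> dict[str, float]:
    """Normalize measures into a probability of waking up as each copy."""
    total = sum(c.measure for c in copies)
    if total == 0:
        raise ValueError("no surviving copies to wake up as")
    return {c.label: c.measure / total for c in copies}


# Level 5: the source copy is destroyed during the procedure (measure 0),
# so I wake up as the destination copy with near certainty.
print(anticipation_probabilities([
    Copy("source, destroyed mid-procedure", 0.0),
    Copy("destination", 1.0),
]))

# Level 6: anticipation varies continuously with measure, and a copy
# instantiated much later (e.g. restored from a backup) counts just the
# same as one instantiated immediately.
print(anticipation_probabilities([
    Copy("copy that dies soon after", 0.25),
    Copy("copy restored from backup next year", 0.75),
]))
```

The normalization step is just level 6’s “continuous function” claim in code: halve a copy’s measure and you halve the odds of waking up as it, with no special treatment for when or where that copy runs.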
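A second, rougher sketch for levels 8-9, again entirely my own hypothetical example: the “mind” below is a deliberately trivial stand-in algorithm, implemented twice on different “substrates” (two independent Python functions), and a superintelligence’s “perfect model” of it turns out to be nothing more than a third implementation of the same algorithm.

```python
# Toy illustration of levels 8-9: the "mind" is just an algorithm, so any
# faithful implementation of it is it, and a perfect model of it is itself
# such an implementation. The algorithm here is a trivial stand-in.

def mind_as_brain(history: tuple[str, ...]) -> str:
    """One implementation of the 'mind' algorithm (think: biological brain)."""
    return "agree" if history.count("good argument") > history.count("bad argument") else "disagree"


def mind_in_silicon(history: tuple[str, ...]) -> str:
    """A different implementation of the same algorithm (think: microchip)."""
    score = 0
    for event in history:
        if event == "good argument":
            score += 1
        elif event == "bad argument":
            score -= 1
    return "agree" if score > 0 else "disagree"


def superintelligence_model(history: tuple[str, ...]) -> str:
    """To predict the mind exactly on every input, the model ends up
    running the same algorithm -- i.e. it is yet another implementation."""
    return mind_in_silicon(history)


history = ("bad argument", "good argument", "good argument")
assert mind_as_brain(history) == mind_in_silicon(history) == superintelligence_model(history)
print(superintelligence_model(history))  # "agree"
```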