There were two ways to interpret your statement—that uploads won’t be identical human beings (an empirical statement) vs. that uploads will disrupt your continuity (a philosophical statement).
I was just wondering which one it was. I’m interested in hearing arguments against uploading:
- How do you know right now that you are a continuity of the being that existed one-hour-in-the-past, and that the being that exists one-hour-in-the-future will be in continuity with you?
- Would you ever step into a sci-fi style teleporter?
- Cryonics constitutes “pausing” and “resuming” yourself. How is this sort of temporal discontinuity different from the spatial discontinuity involved in teleporting?
There were two ways to interpret your statement—that uploads won’t be identical human beings (an empirical statement) vs. that uploads will disrupt your continuity (a philosophical statement).
The latter, but they are both empirical questions. The former deals with comparing informational configurations at two points in time, whereas the latter is concerned with the history of how we went from state A to state B (both having real-world implications).
How do you know right now that you are a continuity of the being that existed one-hour-in-the-past, and that the being that exists one-hour-in-the-future will be in continuity with you?
We need more research on the physical basis for consciousness before we can properly answer this question. Right now all we have is the fleeting experience of continued identity moment to moment, and the induction principle, which is invalid to apply to singular events like destructive uploading.
My guess as to the underlying nature of the problem is that consciousness exists in any complex interaction of particles—not the pattern itself, but the instantiation of the computation. And so long as this interaction is continuous and ongoing we have a physical basis for the continuation of subjective experience.
Would you ever step into a sci-fi style teleporter?
Never, for the same reasons.
Cryonics constitutes “pausing” and “resuming” yourself. How is this sort of temporal discontinuity different from the spatial discontinuity involved in teleporting?
Pausing is a metaphor. You can’t freeze time, and chemistry never stops entirely. The particles in a cryonic patient’s brain keep interacting in complex, albeit much slowed-down, ways. Recall that the point of pumping the brain full of antifreeze is that it remains intact and structurally unmolested even at liquid-nitrogen temperatures. It is likely that some portion of biological activity is ongoing in cryostasis, albeit at a glacial pace. This may or may not be sufficient for continuity of experience, but unlike uploading, the probability is at least not zero.
BTW the problem with teleporting is not spatial or temporal. The problem is that the computational process which is the subjective experience of the person being teleported is interrupted. The machine violently disassembles them and they die; then, somewhere else, a clone/copy is created. If you have trouble seeing that, imagine that the process is not destructive. You step into the teleporter, it scans you, and then you step out. I then shoot you in the head with a gun. The teleporter then reconstructs a copy of you. Do you really think that you, the person I just shot in the head and who is now splattered all over the floor, get to experience walking out of the teleporter as a copy? If you’re still having trouble, imagine that the teleporter got stuck in a loop and kept outputting copies. Which one is you? Which one do you expect to “wake up” as at the other end of the process?
You step into the teleporter, it scans you, and then you step out. I then shoot you in the head with a gun. The teleporter then reconstructs a copy of you. Do you really think that you, the person I just shot in the head and who is now splattered all over the floor, get to experience walking out of the teleporter as a copy? If you’re still having trouble, imagine that the teleporter got stuck in a loop and kept outputting copies. Which one is you? Which one do you expect to “wake up” as at the other end of the process?
My current thought on the matter is that Ishaan0 stepped into the teleporter, Ishaan1a stepped out of the teleporter, and Ishaan1b was replicated by the teleporter.
At time 2, Ishaan2a was shot, and Ishaan2b survived.
Ishaan0 → Ishaan1a → Ishaan2a just died.
Ishaan0 → Ishaan1b → Ishaan2b → Ishaan3b → … gets to live on.
So Ishaan0 can be said to have survived, whereas Ishaan1a has died.
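To make that branching bookkeeping concrete, here is a minimal Python sketch. All the names in it (Node, continues_as, survives) are hypothetical, invented for illustration; the only point is that “survival” in the sense above means at least one chain of continuers remains open:

```python
# Each node is a time-slice of a person; an edge means "continued into".
class Node:
    def __init__(self, name):
        self.name = name
        self.children = []     # later time-slices continuing from this one
        self.dead_end = False  # True if this slice was destroyed with no continuer

    def continues_as(self, name):
        child = Node(name)
        self.children.append(child)
        return child

def survives(node):
    """A time-slice survives if some chain of continuers remains open."""
    if node.dead_end:
        return False
    if not node.children:
        return True  # an ongoing, living instance
    return any(survives(child) for child in node.children)

ishaan0 = Node("Ishaan0")
ishaan1a = ishaan0.continues_as("Ishaan1a")  # walked out of the teleporter
ishaan1b = ishaan0.continues_as("Ishaan1b")  # replicated by the teleporter
ishaan2a = ishaan1a.continues_as("Ishaan2a")
ishaan2b = ishaan1b.continues_as("Ishaan2b")
ishaan2a.dead_end = True                     # shot at time 2

print(survives(ishaan0))   # True:  lives on through the b-branch
print(survives(ishaan1a))  # False: its only continuer was killed
```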
Right now all we have is the fleeting experience of continued identity moment to moment
The way I see it, my past self is “dead” in every respect other than that my current self exists and contains memories of that past self.
I don’t think there is anything fundamental saying we ought to be able to have “expectations” about our future subjective experiences, only “predictions” about the future.
Meaning, if Ishaan0 had a blindfold on, then at time 1, when I step out of the teleporter, I would have memories indicating that my current qualia qualify me to be in the position of either Ishaan1a or Ishaan1b. When I take my blindfold off, I find out which one I am.
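A tiny sketch of what that “prediction, not expectation” distinction cashes out to (hypothetical names again, purely illustrative): before the blindfold comes off, the best available attitude is a credence spread over the positions the evidence is compatible with.

```python
import random

# Positions in the branch diagram consistent with my current memories/qualia:
candidates = ["Ishaan1a", "Ishaan1b"]

# A prediction, not an expectation: equal credence over the candidates.
credence = {name: 1 / len(candidates) for name in candidates}
print(credence)  # {'Ishaan1a': 0.5, 'Ishaan1b': 0.5}

# Taking the blindfold off just reveals which branch this instance is on
# (simulated here by a coin flip).
print("I turn out to be", random.choice(candidates))
```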
The problem is that the computational process which is the subjective experience of the person being teleported is interrupted.
It sounds to me like you’re ascribing some critical, necessary aspect of consciousness to the ‘computation’ that occurs between states, as opposed to the presence of the states themselves.
It strikes me as similar to the ‘sampling fallacy’ of analog audio enthusiasts, who constantly claim that digitization of a recording is by definition lossy because a discrete stream cannot contain all the data needed to reconstruct a continuous waveform.
It sounds to me like you’re ascribing some critical, necessary aspect of consciousness to the ‘computation’ that occurs between states, as opposed to the presence of the states themselves.
Absolutely (although I don’t see the connection to analog audio). Is a frozen brain conscious? No. It is the dynamic response of the brain from which the subjective experience of consciousness arises.
The connection to analog audio seems obvious to me: a digitized audio file contains no music; it contains only discrete samples taken at various times, samples which, when played out properly, generate music. An upload file containing the recording of a digital brain contains no consciousness, but is conscious when run, one cycle at a time.
A sample is a snapshot of an instant of music; an upload is a snapshot of consciousness. Playing out a large number of samples creates music; running an upload forward in time creates consciousness. In the same way that a frozen brain isn’t conscious but an unfrozen, running brain is—an uploaded copy isn’t conscious, but a running, uploaded copy is.
That’s the point I was trying to get across. The discussion of samples and states is important because you seem to have this need for transitions to be ‘continuous’ for consciousness to be preserved—but the sampling theorem explicitly says that’s not necessary. There’s no ‘continuous’ transition between two samples in a wave file, yet the original can still be reconstructed perfectly. There may not be a continuous transition between a brain and its destructively uploaded copy—but the original and the ‘continuous transition’ can still be reconstructed perfectly. It’s simple math.
As a direct result of this, it seems pretty obvious to me that consciousness doesn’t go away because there’s a time gap between states or because the states happen to be recorded on different media, any more than breaking a wave file into five thousand non-contiguous sectors on a hard disk platter destroys the music in the recording. Pretty much the only escape from this is to use a mangled definition of consciousness which requires ‘continuous transition’ for no obvious good reason.
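Since the sampling theorem is doing the heavy lifting in that argument, here is a minimal NumPy sketch of the claim (illustrative values only; the exact-reconstruction guarantee holds for band-limited signals sampled above the Nyquist rate):

```python
import numpy as np

fs = 100.0                            # sampling rate (Hz); Nyquist limit is 50 Hz
ts = np.arange(0.0, 1.0, 1.0 / fs)    # the discrete sample instants
f = 5.0                               # a band-limited test signal: a 5 Hz tone
samples = np.sin(2 * np.pi * f * ts)  # the "wave file": no transitions stored

# Whittaker-Shannon interpolation: rebuild the continuous waveform between
# the samples, from the samples alone.
t = np.linspace(0.0, 1.0, 2000, endpoint=False)
kernel = np.sinc(fs * (t[:, None] - ts[None, :]))  # rows: fine times, cols: samples
reconstructed = kernel @ samples

original = np.sin(2 * np.pi * f * t)
interior = (t > 0.1) & (t < 0.9)  # ignore edge effects from the finite sample window
print(np.max(np.abs(reconstructed - original)[interior]))  # small: waveform recovered
```

Nothing ‘continuous’ is stored between the samples, yet the in-between waveform comes back out; that is the sense in which it is ‘simple math’.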
I’m not saying it goes away; I’m saying the uploaded brain is a different person, a different being, a separate identity from the one that was scanned. It is conscious, yes, but it is not me in the sense that if I walk into an uploader I expect to walk out again in my fleshy body. Maybe that scan is then used to start a simulation from which arises a fully conscious copy of me, but I don’t expect to directly experience what that copy experiences.
The uploaded brain is a different person, a different being, a separate identity from the one that was scanned. It is conscious, yes, and it is me in the sense that I expect with high probability to wake up as an upload and watch my fleshy body walk out of the scanner under its own power.
Of course I wouldn’t expect the simulation to experience the exact same things as the meat version, or expect to experience both copies at the same time. Frankly, that’s an idiotic belief; I would prefer you not bring it into the conversation in the future, as it makes me feel like you’re intentionally trolling me. I may not believe what you believe, but even I’m not that stupid.
OK, I was just checking.