What I like about this story is that it makes more accessible the (to me) obvious fact that, in the absence of technology to synchronize/reintegrate memories from parallel instances, uploading does not solve any problems for you—it at best spawns a new instance of you that doesn’t have those problems, but you still do.
Yet uploading is so much easier than fixing death/illness/scarcity in the physical world that people want to believe it’s the holy grail. And may resist evidence to the contrary.
Destructive uploads are murder and/or suicide.
Wait, why are destructive uploads murder/suicide? A copy of you ceases to exist and then another copy comes into existence with the exact same sense of memories/continuity of self, etc. That’s like going to sleep and waking up. Non-destructive uploads are plausibly like murder/suicide, but you don’t need to go down that route.
Even when it becomes possible to do this at sufficient resolution, I see no reason it won’t be like going to sleep and never waking up.
It’s not as if there is a soul to transfer or share between the two instances, and there is no way to sync the experiences of the two instances.
So I don’t see a fundamental difference between “You go to sleep and an uploaded you wakes up” vs “You go to sleep and an uploaded somebody else wakes up”. In either case it will be a life in which I am not a participant and experiences I will not be able to access.
Non-destructive uploads could be benign, provided they are not used as an excuse for not improving the lives of the original instances.
Consider the following thought experiment: You discover that you’ve just been placed into a simulation, and that every night at midnight you are copied and deleted instantaneously, and in the next instant your copy is created where the original once was. Existentially terrified, you go on an alcohol and sugary treat binge, not caring about the next day. After all, it’s your copy who has to suffer the consequences, right? Eventually you fall asleep.
The next day you wake up hungover as all hell. After a few hours of recuperation, you consider what has happened. This feels just like waking up hungover before you were put into the simulation. You confirm that the copy and deletion did indeed occur. Are you still the same person you were before?
You’re right that it’s like going to sleep and never waking up, but Algon was also right about it being like going to sleep and waking up in the morning, because from the perspective of the “original” you, those are both the same experience.
Your instance is the pattern, and the pattern is moved to the computer.
Since consciousness is numerically identical to the pattern (or, more precisely, the pattern being processed), the question of how to get my consciousness in the computer after the pattern is already there doesn’t make sense. The consciousness is already there, because the consciousness is the pattern, and the pattern is already there.
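For readers who think in code, the pattern-vs-instance distinction here maps loosely onto value equality vs. object identity. A minimal Python sketch of that analogy (the `MindState` class and its contents are my own illustrative inventions, not anything from the thread):

```python
import copy

class MindState:
    """Toy stand-in for 'the pattern': memories and dispositions as plain data."""
    def __init__(self, memories):
        self.memories = memories

original = MindState(memories=["childhood summers", "first job"])

# "Destructive upload": duplicate the pattern, then destroy the original.
upload = copy.deepcopy(original)
del original  # removes our only handle to the original instance

# The pattern survives bit-for-bit...
print(upload.memories)  # ['childhood summers', 'first job']

# ...so value-identity (same pattern) is preserved, while the particular
# instance that used to carry it is gone. Whether that counts as survival
# is exactly what this thread is arguing about.
```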
Now, if synchronizing minds is possible, it would address this problem.
But I don’t see nearly as much attention being put into that as into uploading. Why?
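For what it’s worth, the simplest concrete picture of “synchronizing” would be a conflict-free merge of two diverging experience histories, in the spirit of a grow-only-set CRDT. A hypothetical Python sketch (the `ExperienceLog` type and its methods are assumptions for illustration, not a real protocol):

```python
from dataclasses import dataclass, field

@dataclass
class ExperienceLog:
    """Toy model: each instance accumulates timestamped experiences."""
    events: set = field(default_factory=set)  # of (timestamp, description) pairs

    def record(self, t: float, what: str) -> None:
        self.events.add((t, what))

    def merge(self, other: "ExperienceLog") -> "ExperienceLog":
        # Grow-only set union: the simplest conflict-free merge, after
        # which both instances share every experience either one had.
        return ExperienceLog(self.events | other.events)

biological = ExperienceLog()
uploaded = ExperienceLog()
biological.record(1.0, "walked the dog")
uploaded.record(1.5, "explored a virtual city")

# A sync would reintegrate the two diverging histories into one record.
merged = biological.merge(uploaded)
print(sorted(merged.events))
```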
I note a distributional shift issue: the concept of a single, continuous you only exists due to the limitations of biology, and once digital uploads can happen, the concept of personality can get very weird indeed. The real question becomes: does it still matter then? Well, that’s a question that won’t be solved by philosophers.
So the real lesson is: be wary of distributional shift mucking up your consciousness.
I’m also biting the bullet and saying that this is probably what we should aim for, barring pivotal acts, because I see AGI development as mostly inevitable, and there are far worse outcomes than this.
Dead is dead, whether due to AGI or due to a sufficient percentage of smart people convincing themselves that destructive uploading is good enough and continuity is a philosophical question that doesn’t matter.