Even if you, personally, happen to die, you’ve still got a copy of yourself in backup that some future generation will hopefully be able to reconstruct.
Is there a consensus on the whole brain backup identity issue?
I can’t say that trying to come up with intuition pumps about life extension has made me less confused about consciousness, but it does seem fairly obvious to me that if I’m backing up my brain, I’m just creating a second version who shares my values and capacities, not actually extending the life of version A. Being able to have both versions alive at the same time seems a clear indicator that they’re not the same, and that when source A dies, copy B just goes on with their life and doesn’t suddenly become A.
Unfortunately, I’m not sure the same argument doesn’t also apply to a single brain at different points in time. If you atomize my brain now and put it back together later, am I still A, or is A dead? What about coma, sleep, or any other interruption of consciousness?
It’s all kind of a blur to me.
The idea of a persistent personal identity has no physical basis. I am not questioning consciousness, only saying that the mental construct that there is ownership of some particular sequence of conscious feelings over time is inconsistent with reality (as I would argue all the teleporter-type thought experiments show). So in my view, all that matters is how much a certain entity X decides (or instinctively feels) it should care about some similar-seeming later entity Y.
No, and thank you for pointing out the potential for confusion in this post. I have edited some key wording: “results in the continuation of the perception of consciousness” has now been changed to “results in a perception of consciousness functionally indistinguishable to an outside observer,” which much more closely reflects my intent.
So in other words, if John Doe went into a locked room, created a copy of himself, incinerated the original version, disposed of all the ashes, and then walked out of the room, the copy would be indistinguishable from the original John Doe from your perspective as an outside observer.
How John Doe himself perceives that interaction is an extremely difficult question to answer (or even to really formulate scientifically).
But that does not make it any less relevant a question.
“Outside observers” can be very different. You probably need to define that observer more precisely.
NO.
There are many like me who see what the OP advocates as a gigantic holocaust. “Murder the entire population of the world and replace them with artificial copies” is a terrifying outcome.