Just to be clear: I’m not talking about qualia. Don’t mistake what I’m saying for qualia or silent observers or any such thing. I’m talking about three broad classes of physical problems with uploads:
1. Real but as-yet-undiscovered biochemical or biophysical phenomena whose disruption you would notice but other people would not. In other words, the fidelity of an upload sufficient to convince other people will likely be lower than the fidelity sufficient to convince the original. So if the original must be disassembled in order to create the upload, that is automatically a red flag.
2. Perhaps an atom-perfect copy of you is you, but I don’t see how it follows that an atom-perfect simulation of you must also be you. And it’s a bold assertion that anything close to that level of accuracy will even be feasible in the first place, especially on a species-wide scale.
3. Even if we ignore substrate and assume that an atom-perfect simulation of you really is you at the instant it’s created, your paths diverge from that point forward: it becomes a happy, sparkly, transcendent immortal being, and you are still the poor bastard trapped inside a meat-puppet, doomed to die from cancer if heart disease doesn’t get you first. You can spawn off as many uploads as you want, with the exact same result. Nor does it help if the upload is destructive—that just shortens the delay between spawning your last upload and dying from several decades to zero seconds.
The above is not dualism. It’s the opposite view that smacks of sympathetic magic: the notion that your copy is somehow linked to you even after the copying process is complete, and that because it’s immortal you no longer have to worry about dying yourself.
Now, if uploading is accompanied by some sort of continuous brain-sync technology, that may be a different matter.