When it comes to questions like whether you “should” consider destructive uploading, it seems to me that the answer depends on what the alternatives are, not just on one's position on personal identity.
If the only viable alternative is dying anyway, soon or horribly, and the future belongs only to entities whose behavior is not based on my memories, personality, beliefs, and values, then I might consider uploading even in the case where that seems like suicide to the physical me. Having some expectation of personally experiencing being that entity is a bonus, but not strictly necessary.
Conversely, if my expected lifespan is otherwise long and likely to be fairly good, then I may decline destructive uploading even if I’m very confident (somehow?) of personally experiencing being that upload and it seems likely that the upload would, on median, have a better life. For one thing, people may devise non-destructive uploading later. For another, uploads seem more vulnerable to future s-risks, or to major changes in things that I currently consider part of my core identity.
Even non-destructive uploading might not be that attractive if it’s very expensive or otherwise onerous on the physical me, or likely to result in the upload having a poor quality of life or being very much not-me in measurable ways.
It seems extremely likely that the uploads would believe (or behave as if they believe, in the hypothetical where they’re not conscious beings) in continuity of personal identity across uploading.
It also seems like an adaptive belief even if false, since it allows strictly more options for agents that hold it than for those that don’t.
All agreeable. Note that this is perfectly compatible with the relativity theory I propose, i.e. with the ‘should’ being entirely up to your own intuition. In fact, I’d argue the relativity theory is the only way to settle the debates you invoke, or to give you peace of mind when facing these risky uploading situations.
Say you can destructively upload overnight: with 100% reliability, your digital clone will live in a nicely replicated digital world for 80 years (for simplicity, assume for now that the uploadee can be expected to be a consciousness comparable to ours), while the physical ‘you’ might otherwise be killed overnight with probability x%. Think of x as a concrete number; say 50%. For that x, would you want to upload?
I’m pretty certain that (i) you have no clear-cut threshold x% above which you’d prefer the upload (although some might put it at roughly 100%), and (ii) that threshold would clearly vary a lot across persons.
Who can say? Only the relativity theory: there is no objective answer from your self-regarding perspective.
Just as it’s your intrinsic taste that determines how much, or whether at all, you care about the faraway poor, or the not-so-faraway not-so-poor, or anyone really: it’s a matter of your taste and nothing else.

If, imagining the move from your body into the machine, it feels like simply you being you there, without much dread or worry, and you look forward to that being (or sequence of beings) ‘living’ with near certainty another 80 years, then yes, you’re right: go for it whenever the physical killing probability x% is more than a few percent. After all, there will be that being in the machine, and for all intents and purposes you might call it ‘you’ in sloppy speech.

If instead you dread the prospect of your physical self being destroyed and ‘only’ replaced by what feels like ‘obviously a non-equivalent future entity that merely has copied traits, even if it behaves just as if it were the future you’, then you’re right to refuse the upload for any x% not close enough to 0%.

It really is relative. You are only the current you, including your weights-of-care for different potential future successors, and there is no outside authority to tell you which weights are right or wrong.
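To make that relativity concrete, here is a minimal toy sketch (my own hypothetical formalization, not anything proposed in the thread): treat your weight-of-care for the upload counting as your survival as a parameter w in [0, 1], and compare care-weighted expected years. The 80-year figure comes from the scenario above; the 50-year remaining physical lifespan is an arbitrary assumed value.

```python
# Toy model of the upload decision. All numbers and parameter names are
# hypothetical; w ("weight-of-care") is the agent's personal, taste-like
# credence that the upload counts as their own survival.

def upload_threshold(w: float,
                     upload_years: float = 80.0,        # from the scenario above
                     remaining_years: float = 50.0):    # assumed physical lifespan
    """Killing probability x* above which uploading beats staying physical.

    Uploading yields w * upload_years care-weighted years; declining yields
    (1 - x) * remaining_years. Setting the two equal and solving for x
    gives the indifference threshold x*.
    """
    return 1.0 - (w * upload_years) / remaining_years

for w in (0.0, 0.3, 0.6, 1.0):
    x_star = max(0.0, upload_threshold(w))
    print(f"w = {w:.1f}  ->  prefer upload once x > {x_star:.0%}")
```

Note how the toy model reproduces both stances above: at w = 1 the threshold is (at most) 0%, so even a small killing risk favors uploading, while at w = 0 it is 100%, matching those who would refuse for any x% not close to it. Since nothing in the model pins down w, the threshold is exactly as person-relative as claimed.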