All agreeable. Note that this is perfectly compatible with the relativity theory I propose, i.e. with the ‘should’ being entirely up to your own intuition. And, actually, the relativity theory, I’d argue, is the only way to settle the debates you invoke, or, say, to give you peace of mind when facing these risky uploading situations.
Say you can destructively upload overnight: with 100% reliability, your digital clone will live in a nicely replicated digital world for 80 years (let’s for simplicity assume for now that the uploadee can be expected to be a consciousness comparable to ours at all), while ‘you’ might otherwise be killed overnight with probability x%. Think of x as a concrete number, say a 50% chance. For that x, would you want to upload?
I’m pretty certain that (i) you have no clear-cut answer as to the threshold x% above which you’d prefer the upload (although some might put it at roughly 100%), and (ii) that threshold x% would clearly vary a lot across persons.
Who can say? Only my relativity theory: there is no objective answer from your self-regarding perspective.
Just as it’s your intrinsic taste that determines how much, or whether at all, you care about the faraway poor, or the not-so-faraway not-so-poor, or anyone really: it’s a matter of your taste and nothing else. If you’re right now imagining going from you to inside the machine, and it feels like that’s simply you being you there, without much dread and with no worries, looking forward to that being (or sequence of beings) ‘living’ with near certainty another 80 years, then yes, you’re right: go for it if the physical killing probability x% is more than a few percent. After all, there will be that being in the machine, and for all intents and purposes you might call it ‘you’ in sloppy speak. If instead you dread your physical self being destroyed and ‘only’ replaced by what feels like ‘obviously a non-equivalent future entity that merely has copied traits, even if it behaves just as if it were the future you’, then you’re right to refuse the upload for any x% not close enough to 0%. It really is relative. You are only the current you, including your weights-of-care for different potential future successors, and there is no outside authority to tell you which weights are right or wrong.
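To make that last point concrete, here is a purely illustrative sketch, not part of the argument itself: it assumes a toy expected-value framing in which you assign a subjective weight-of-care w (from 0 to 1) to the digital successor relative to your physical continuation, and both lives would run roughly another 80 years. The weight w, the year counts, and the linear comparison are all assumptions chosen for illustration; the only point is that the threshold x falls out of your own w, on which no outside authority can rule.

```python
# Toy sketch: an expected-value framing of the upload choice.
# All names and numbers here are illustrative assumptions, not claims from the text.

def prefers_upload(w_successor: float, x: float,
                   upload_years: float = 80.0,
                   remaining_years: float = 80.0) -> bool:
    """Toy decision rule.

    w_successor: your subjective weight-of-care for the digital successor,
                 on a 0..1 scale relative to your physical continuation.
    x:           probability (0..1) of being killed overnight if you do not upload.
    """
    value_upload = w_successor * upload_years          # what the upload is worth to you
    value_stay = (1.0 - x) * remaining_years           # expected value of staying physical
    return value_upload > value_stay


for w in (0.0, 0.5, 1.0):
    # With equal year counts, the smallest x at which upload wins is x* = 1 - w.
    threshold = 1.0 - w
    print(f"w = {w:.1f}: upload preferred once x > {threshold:.0%}; "
          f"at x = 50%: {prefers_upload(w, 0.5)}")
```

With these toy numbers the threshold is simply x* = 1 − w: someone who fully identifies with the successor (w near 1) uploads at almost any risk, while someone who does not (w near 0) refuses unless death is near certain, which is exactly the spread of answers described above.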