I’m just not sure about the way you’re discounting the preferences of Nornagest(torture1000). In my imagination he’s there, screaming for the pain to stop, screaming that he takes it back, he takes it back, he takes it back. His preferences are so diametrically opposed to “yours” (the Nornagest making this decision now) that I almost question your right to make this decision for him.
Well, I actually do believe that Gest(torture1000)’s preferences are consistent with my current ones, absent the stress and lingering aftereffects of the uploading process. That is, if Omega were to pause halfway through that subjective thousand years and offer him a cup of tea and n subjective years of therapy for the inevitable post-traumatic problems, at the end of it I think he’d agree that Gest(now) made the right choice.
I don’t think that predictably biased future preferences ought to be taken into account without adjusting for the bias. Let’s say I’m about to go to a party some distance away. I predict that I’ll want to drive home drunk afterward; I’m also aware both that that’s a bad idea and that I won’t think so six hours from now. Giving my car keys to the host predictably violates my future preferences, but I’m willing to overlook that to eliminate the possibility of wrapping my car around a fire hydrant.
That is, if Omega were to pause halfway through that subjective thousand years and offer him a cup of tea and n subjective years of therapy for the inevitable post-traumatic problems, at the end of it I think he’d agree that Gest(now) made the right choice.
If I accept that’s true, my moral objection goes away.
Hm.
I can imagine myself agreeing to be tortured in exchange for someone I love being allowed to go free. I expect that, if that offer were accepted, shortly thereafter I would agree to let my loved one be tortured in my stead, if only that would make the pain stop. I expect that, if that request were granted, I would regret that choice and might even agree to be tortured again.
It would not surprise me to discover that I could toggle between those states several times until I eventually had a nervous breakdown.
It’s really unclear to me how I’m supposed to account for these future selves’ expressed preferences, in that case.
In the case that the tortured-you would make the same decision all over again, my intuition (I think) agrees with yours. My objection is basically to splitting off “selves” and subjecting them to things that the post-split self would never consent to.
(nods) That’s reasonable.
OTOH, I do think I can consent now to consequences that my future self will have to suffer, even if that future self, at the point when the benefits are past and the costs are current, withdraws that consent.