My difficulty here is that the difference between making a choice for myself and making it for someone else actually does seem to matter to me, so reasoning from analogy to the “torture someone else” scenario isn’t obviously legitimate.
That is: let’s assume that, given that choice, I would forego immortality. (Truthfully, I don’t know what I would do in that situation, and I doubt anyone else does either; I suspect it depends enormously on how the choice is framed.) It doesn’t necessarily follow that I would forego immortality rather than subject myself to it.
This is similar to the sense in which I might be willing to die to save a loved one’s life, but it doesn’t follow that I’d be willing to kill for it. It seems to matter whether or not the person I’m assigning a negative consequence to is me.
It doesn’t necessarily follow that I would forego immortality rather than subject myself to it.
But then you’re talking about putting a future-you into a situation where you know that experiences will dramatically reshape that future-you’s priorities and values, to the point where TheOtherDave(torture1000)’s decisions and preferences would diverge markedly from your current ones. I think making this decision for TheOtherDave(torture1000) is a lot like making it for someone else, given that you know TheOtherDave(torture1000) is going to object violently to it.