The thing is, I’m genuinely not sure if it matters. To restate what you’re doing another way, “If I make a copy of you every night and suspend it until morning, and also there’s a you that gets tortured but it never causally affects anything else”—I think if you’re vulnerable to coercion via that, you’d also have to be vulnerable to “a thousand tortured copies in a box” style arguments.
You may have missed the long addition I just made to my comment, which avoids the torture issue… however, being vulnerable to “a thousand tortured copies in a box” is not necessarily a bad thing! Just because viewing outcome A as bad renders you vulnerable to blackmail by the threat of A doesn’t automatically mean that you should change your attitude toward A. Otherwise, why not just accept death and the natural lifespan, rather than bother with expensive attempts to live, like cryonics? If you care about avoiding death, you end up spending all this time and energy trying to stay alive, when you could just be enjoying life; so why not change your value system and save yourself the trouble of unnatural life extension… I hope you see the analogy.
I can’t say I do. Death doesn’t care what I think, but other actors may care how you perceive things. Ironically, if you want to minimize torture used for coercion, the most effective policy may be to ignore it, like refusing to negotiate with terrorists.
On one hand you’re saying it’s good to identify with your copies, because then you can play iterated Russian roulette and win. On the other hand, you’re saying it’s bad to identify with your copies to the extent of caring whether someone tortures them. Presumably you don’t want to be tortured, and your copies don’t want to be tortured, and your copies are you, yet you don’t care whether they are tortured… congratulations, I think you’ve invented strategic identity hypocrisy for uploads!
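The asymmetry being pointed at can be made concrete with a toy simulation (a sketch only, assuming a 1-in-6 death chance per round and that a suspended backup resumes whenever the active copy loses; the function names and parameters here are illustrative, not anyone's actual proposal):

```python
import random

def solo_survival(rounds, trials=100_000, p_death=1/6, seed=0):
    """Fraction of trials in which a single, uncopied player
    survives every round of the roulette."""
    rng = random.Random(seed)
    survived = 0
    for _ in range(trials):
        if all(rng.random() >= p_death for _ in range(rounds)):
            survived += 1
    return survived / trials

def copied_survival(rounds):
    """If a suspended backup is restored whenever the active copy
    loses, someone who identifies with whichever copy remains
    always experiences surviving: probability 1 by construction."""
    return 1.0

# A lone player's odds fall geometrically: (5/6)**10 is about 0.16,
# while the copy-identifier's odds stay at 1 -- but only so long as
# the losing copies never matter in the survivor's future.
print(solo_survival(10))
print(copied_survival(10))
```

The point of the contrast is that the "win at roulette" argument relies on identifying with the surviving copy, while the "don't care about tortured copies" stance relies on not identifying with the losing ones.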
I think the issue of causal interpolation comes up. From where I’m standing right now, the tortured copies never become important in my future; what I’m doing with the boxes is sort of smoothing out the becoming-important-ness, so that even if I turn out to be a losing copy, I will identify with the winning copy, since they’re what dominates the future. Call it mangled priorities. You could effectively threaten me by releasing the tortured copies into my future coexistence, at which point the most practical solution might be for my tortured copies to choose suicide, since they wouldn’t want their broken existence to dominate the future of the set of copies that are me and causally interacting. How the situation would evolve if the tortured copies never interacted again, I don’t know. I’d need to ask a superintelligence what ought to determine anticipation of subjective existence.
[edit] Honestly, what I’m really doing is trying to precommit to the stance that maximizes my future effectiveness.
Nah, I care about the copies that can interact with me in the future.
[edit] No, that doesn’t work. Rethinking.