I think the issue of causal interpolation comes up. From where I’m standing right now, the tortured copies never become important in my future; what I’m doing with the boxes is, in effect, smoothing out the becoming-important-ness, so that even if I turn out to be a losing copy, I will identify with the winning copy, since they’re what dominates the future. Call it mangled priorities. You could effectively threaten me by releasing the tortured copies into my future coexistence, at which point the most practical solution for my tortured copies might be to choose suicide, since they wouldn’t want their broken existence to dominate the future of the set-of-copies-that-are-me-and-causally-interacting. How the situation would evolve if the tortured copies never interacted with me again, I don’t know. I’d need to ask a superintelligence what ought to determine anticipation of subjective existence.
[edit] Honestly, what I’m really doing is trying to precommit to the stance that maximizes my future effectiveness.