I appreciate the reply. I recognize both of those arguments, but I am asking something different. If Omega tells me to give him a dollar or he tortures a simulation, a being separate from me, with no threat that I might be that simulation (I’m also thinking of the Basilisk here), why should I care whether that simulation is one of me as opposed to any other sentient being?
I see them as equally valuable. Both are not-me. Identical-to-me is still not-me. If I am a simulation and I meet another simulation of me in Thunderdome (Omega is an evil bastard), I’m going to kill that other guy just the same as if he were someone else. I don’t get why sim-self is of greater value than sim-other. Everything I’ve read here (admittedly not too much) seems to treat this as self-evident, but I can’t find a basis for it. Is the “it could be you who is tortured” just implied in all of these examples and I’m not up on the convention? I don’t see it specified, and in “The AI boxes you” the “it could be you” is a tacked-on threat in addition to the “I will torture simulations of you,” implying that the starting threat alone is supposed to be enough to give pause.
If you love your simulation as you love yourself, they will love you as they love themselves (and if you don’t, they won’t). You can choose to have enemies or allies with your own actions.
You and a thousand simulations of you play a game where pressing a button gives the presser $500 but takes $1 from each of the other players. Do you press the button?
I don’t play; craps is the only sucker bet I enjoy engaging in. But if coerced to play, I press with non-sims. Don’t press with sims. Not out of love, but out of an intimate knowledge of my opponents’ expected actions, and out of my status as a reliable predictor in this unique circumstance.
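To make the arithmetic explicit, here is a minimal sketch of the payoffs, assuming the game is you plus the thousand simulations (1001 players total), a $500 gain per press, and a $1 loss to each other player per press. Against uncorrelated opponents, pressing strictly dominates; against exact copies, decisions are correlated, so only the all-press and no-press outcomes are reachable, and all-press leaves everyone $500 in the hole.

```python
# A minimal sketch of the button game's payoffs, assuming 1000 simulations
# plus the original player (1001 players total) and the stakes given above:
# pressing pays the presser $500 and costs each other player $1 per press.

N_PLAYERS = 1001  # you + 1000 simulations (assumed from the setup)

def payoff(i_press: bool, others_pressing: int) -> int:
    """Net dollars for one player, given their own choice and how many
    of the other players press."""
    gain = 500 if i_press else 0
    loss = others_pressing  # $1 per other presser
    return gain - loss

# Against uncorrelated opponents, pressing dominates: whatever the others
# do, you end up $500 richer for pressing.
print(payoff(True, 500) - payoff(False, 500))   # 500, regardless of the 500

# Against exact simulations, everyone decides the same way, so only the
# diagonal outcomes (all press, none press) can actually occur.
print(payoff(True, N_PLAYERS - 1))   # 500 - 1000 = -500 (everyone presses)
print(payoff(False, 0))              # 0 (no one presses)
```

This is why the "reliable predictor" framing does the work: against copies, choosing "press" just is choosing the all-press outcome at -$500 each, while choosing "don't press" just is choosing $0 for everyone.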