OK, imagine that somewhere far away in the universe—or maybe one room over, it doesn’t matter—there is an exact physical replica of you that is also, through some genius engineering, being provided the exact same percepts (sight, hearing, touch, etc.) that you are. Its mental states remain exactly identical to yours.
Should you still care? To me it’d still be someone different.
Suppose I offer you a dollar in return for making a trillion virtual copies of you and shooting them all with a gun, with the promise that I won’t make any copies until after you agree. Since the copies haven’t been made yet, this ensures that you must be the original, and since you don’t care about identical copies of yourself (they’re technically different people from you), you happily agree. I nod, pull out a gun, and shoot you.
(In the real universe—or at least the universe one level up on the simulation hierarchy—a Mark Friedenbach receives a dollar. This isn’t of much comfort to you, of course, seeing as you’re dead.)
You shouldn’t murder sentient beings or cause them to be murdered by the trillions. Both are generally considered dick moves. Shame on you both. My argument: a benefit to an exact copy is of no intrinsic benefit to a different copy or to the original, unless some Omega starts playing evil UFAI games with them. The trillion other copies are unaffected by this murder. Original or copy is irrelevant; it is the being we are currently discussing that is relevant. If I am the original, I care about myself. If I am a copy, I care about myself. Whether I even care that I’m a copy depends on various aspects of my personality.
If I offered you the same deal I offered to Mark Friedenbach, would you agree? (Please answer with “yes” or “no”. You’re free to expand on your answer, but first please make sure you give an answer.)
No. It’s a dick move. Same question and they’re not copies of me? Same answer.
As I’m sure you’re aware, the purpose of these thought experiments is to investigate what exactly your view of consciousness entails from a decision-making perspective. The fact that you would have given the same answer even if the virtual instances weren’t copies of you shows that your reason for saying “no” has nothing to do with the purpose of the question. In particular, telling me that “it’s a dick move” does not help elucidate your view of consciousness and self, and thus does not advance the conversation. But since you insist, I will rephrase my question:
Would someone who shares your views on consciousness but doesn’t give a crap about other people say “yes” or “no” to my deal?
Sorry if my attempt at coloring the conversation with humor upset you. That was not my intent. However, you will find it did nothing to alter the content of our discourse. You have changed your question. The question you ask now is not the question you asked previously.
Previous question: No, I do not choose to murder trillions of sentient me-copies for personal gain. I added an addendum to provide further information, anticipating a possible follow-up question: neither would I murder trillions of sentient not-me copies.
New question: Yes, an amoral dick who shares my views on consciousness would say yes.
No, I don’t want you to murder a trillion people, even if those people are not me.
Care in terms of what? You have no way of knowing which one you are, so if you’re offered the option to help the one in the left room, you should, because there’s a 50% chance that’s you. Actually, I would say it’s not well defined whether you’re one or the other; you’re both until an “observation/divergence”. But what specific decision hinges on the question?