You shouldn’t murder sentient beings, or cause them to be murdered, by the trillions. Both are generally considered dick moves. Shame on you both. My argument: a benefit to an exact copy is of no intrinsic benefit to a different copy or to the original, unless some Omega starts playing evil UFAI games with them. One trillion other copies are unaffected by this murder. Original or copy is irrelevant; it is the being we are currently discussing that is relevant. If I am the original, I care about myself. If I am a copy, I care about myself. Whether I even care that I am a copy depends on various aspects of my personality.
If I offered you the same deal I offered to Mark Friedenbach, would you agree? (Please answer with “yes” or “no”. You’re free to expand on your answer, but first please make sure you give an answer.)
No. It’s a dick move. Same question and they’re not copies of me? Same answer.
As I’m sure you’re aware, the purpose of these thought experiments is to investigate what exactly your view of consciousness entails from a decision-making perspective. The fact that you would have given the same answer even if the virtual instances weren’t copies of you shows that your reason for saying “no” has nothing to do with the purpose of the question. In particular, telling me that “it’s a dick move” does not help elucidate your view of consciousness and self, and thus does not advance the conversation. But since you insist, I will rephrase my question:
Would someone who shares your views on consciousness but doesn’t give a crap about other people say “yes” or “no” to my deal?
Sorry if my attempt at coloring the conversation with humor upset you. That was not my intent. However, you will find it did nothing to alter the content of our discourse. You have changed your question. The question you ask now is not the question you asked previously.
Previous question: No, I do not choose to murder trillions of sentient me-copies for personal gain. I added an addendum to provide further information, anticipating a likely follow-up question: neither would I murder trillions of sentient not-me copies.
New question: Yes, an amoral dick who shares my views on consciousness would say yes.