All your arguments really prove is that if your copy diverges from you, it’s not you anymore. But that’s only because once something happens to your copy but not to you, you know which one you are. The import of “you have no way of knowing which copy you are” disappears. Conversely, if you don’t know which one you are, then both must be your consciousness, because you know you are conscious.
Edit: the last point is not strictly rigorous; you could know that one is conscious but not know which. But it seems to me that if you know the two are equal in every relevant detail, and don’t know which you are, then they both must be conscious (anti-zombie principle, whatever), and since you can’t tell which you are, there’s a sense in which you’re “both”. That probably has subtle objections, but nothing that bothers me right now. If anyone wants to argue against that, I’d be interested; I just didn’t think this post was really doing that, based on the examples given, where the copy diverges.
Thanks for the reply. To your last point, I am not speaking of zombies. Every copy I discussed above is assumed to have its own consciousness. To your first points, at no time is there any ambiguity or import to the question of “which one I am”. I know which one I am, because here I am, in the meat (or in my own sim, it makes no difference). When a copy is made, its existence is irrelevant: even if it should live in a perfect sim and deviate not one bit, I do not experience what it experiences. It is perhaps identical or congruent to me. It is not me.
My argument, boiled down, is that your transhuman copy is of questionable value to your meat self. For the reasons stated above (chiefly that “You” are the result of activity in a specific brain), fuck that guy. You don’t owe him an existence. If you who are reading this ever upload with brain destruction, you will have committed suicide. If you upload without brain destruction, you will live the rest of your meat life and die. If you brain-freeze, something perfectly you-like will live after you die, with zero effect on you.
I stand by that argument, but, this being a thorny issue, I have received a lot of great feedback to think on.
To your first points, at no time is there any ambiguity or import to the question of “which one I am”. I know which one I am, because here I am, in the meat (or in my own sim, it makes no difference). When a copy is made, its existence is irrelevant: even if it should live in a perfect sim and deviate not one bit, I do not experience what it experiences. It is perhaps identical or congruent to me. It is not me.
Can you explain how you know that you’re the meat space one? If every observation you make is made by the other one, and both are conscious, how do you know you’re not a sim? This is a purely epistemic question.
I’m perfectly happy saying “if I am meat, then fuck sim-me, and if I am sim, fuck meat-me” (assuming selfishness). But if you don’t know which one you are, you need to act to benefit both, because you might be both.
On the other hand, if you see the other one, there’s no problem fighting it, because the one you’re fighting is surely not you. (But even so, if you expect them to do the same as you, then you’re in a perfect prisoner’s dilemma and should cooperate.)
On the other hand, I think that if I clone myself, then do stuff my clone doesn’t do, I’d still be less worried about dying than if I had no clone. I model that as “when you die, some memories are wiped and you live again”. If you concede that wiping a couple days of memory doesn’t destroy the person, then I think that’s enough for me. Probably not you, though. What’s the specific argument in that case?
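(Aside: the decision-theoretic core of the “you might be both” argument above can be made concrete with a little arithmetic. A minimal sketch, assuming a purely selfish agent and an even credence split between being the meat instance and the sim instance; the payoff numbers are illustrative, not anything the commenters specified.)

```python
# Selfish expected value of an action when you cannot yet tell which
# instance you are: weight the benefit to each instance by your credence
# that you are that instance.
def expected_value(p_meat, benefit_meat, benefit_sim):
    return p_meat * benefit_meat + (1 - p_meat) * benefit_sim

# Before any divergence there is no evidence either way, so p_meat = 0.5.
# Helping only "the one in the left room" still pays off half the time:
print(expected_value(0.5, benefit_meat=10, benefit_sim=0))   # 5.0
# An action that benefits both instances dominates it:
print(expected_value(0.5, benefit_meat=10, benefit_sim=10))  # 10.0
```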
Meat or sim, or both meat, aren’t issues as I see it. I am self, fuck non-self; or, not to be a dick, value non-self less than self, certainly on existential issues. “I” am the awareness within this mind. “I” am not memory, thought, feeling, personality, etc. I know I am me ipso facto, as the observer of the knower. I don’t care if I was cloned yesterday or one second ago, and there are many theoretical circumstances where this could be the case. I value the “I” that I currently am, just exactly now. I don’t believe that this “I” is particularly changeable. I fear senility because “I” am the entity which will be aware of the unpleasant thoughts and feelings associated with the memory loss, the fear of worsening status, and the eventual nightmarish half-life of idiocy. That being will be unrecognizable as me on many levels, but it is me, whereas a perfect non-senile copy is not me, although he has the experience of feeling exactly as I would, including the same stubborn ideas about his own importance over any other copies or originals.
I don’t believe that this “I” is particularly changeable
I don’t know what you mean by that.
Why can’t a perfect copy be you? Doesn’t that involve epiphenomenalism? Even if I gave the entire state of the world at some time X in the future, I’d also need to specify which identical beings are “you”.
It’s a sticky topic, consciousness. I edited my post to clarify further:
I define consciousness as a passively aware thing, totally independent of memory, thoughts, feelings, and unconscious hardwired or conditioned responses. It is the hard-to-get-at thing inside the mind which is aware of the activity of the mind without itself thinking, feeling, remembering, or responding.
Which I recognize might sound a bit mystical to some, but I believe it is a real thing which is a function of brain activity.
As a function of the brain (or whatever processing medium), consciousness or self is tied to matter. The consciousness in the matter that is experiencing this consciousness is me. I’m not sure if any transfer to alternate media is possible. The same matter can’t be in two different places; therefore every consciousness is a unique entity, although identical ones can exist via copying. I am the one aware of this mind as the body is typing. You are the one aware of the mind reading it. Another might have the same experience, but that won’t have any intrinsic value to You or Me.
If I copy myself and am destroyed in the process, is the copy me? If I copy myself and am not destroyed, are the copy and the original both me? If I am a product of brain function (otherwise I am a magical soul) and if both are me then my brain is a single set of matter in two locations. Are they We? That gets interesting. Lots to think about but I stand with my original position.
If I gradually replace every atom in your brain with a different one until you have no atoms left, but you still function the same, are you still “you”? If not, at what point did you stop?
Have you seen Yudkowsky’s series of posts on this?
I’m familiar with the concept, though not his specific essays, and, indeed, this literally does happen. Our neurons are constantly making and unmaking their largely proteinaceous components and, over a lifetime, there is a no doubt partial, perhaps complete, turnover of the brain’s atoms. I find this not problematic at all, because the electrochemical processes which create consciousness go on undisturbed. The idea that really queers my argument for me is that of datum-by-datum transfer: each new datum, in the form of neuronal activity (the 1s and 0s of electrical discharge or non-discharge), is started by my current brain but saved in another. Knee-jerk, I would tend to say that such a transfer is a complete one and my self has been maintained. The problem comes when a copy, rather than a transfer, is made by the exact same datum-by-datum process, run in parallel until completed and then cloven (good luck untying that knot). At the end I have two beings who seem to meet my definition of Me.
However, this argument does not convince me of the contrary position of the sameness of self and copy, and it does nothing to make me care about a me-like copy coming into being a thousand years from now, and does not induce me to step up onto Dr Bowie-Tesla’s machine.
At what price do you fall into the drowning pool in order to benefit the being, 100m to your left, that feels exactly as if it were you, as you were one second ago? How about one who appears 1,000,000 years from now? The exact eyes that see these words will be the ones in the water. I can’t come up with any answer other than “fuck that guy”. I might just be a glass-half-empty kind of guy, but someone always ends up stuck in the meat, and it’s going to be that being which remains behind these eyes.
Can you read http://lesswrong.com/lw/qp/timeless_physics/, http://lesswrong.com/lw/qx/timeless_identity/, and http://lesswrong.com/lw/qy/why_quantum/, with any relevant posts linked therein? (Or just start at the beginning of the quantum sequence.)
Note that you can believe everyone involved is “you”, and yet not care about them. The two questions aren’t completely orthogonal, but identifying someone with yourself doesn’t imply you should care about them.
At what price do you fall into the drowning pool in order to benefit the being, 100m to your left, that feels exactly as if it were you, as you were one second ago?
The same price I would accept to have the last second erased from my memory, but first feel the pain of drowning. That’s actually not so easy to set. I’m not sure how much it would cost to get me to accept X amount of pain plus removal of the memory, but it’s probably less than the cost for X amount of pain alone.
How about one who appears 1,000,000 years from now?
That’s like removing the last second of memory, plus pain, plus jumping forward in time. I’d probably only do it if I had a guarantee that I’d survive and be able to get used to whatever goes on in the future and be happy.
“I model that as ‘when you die, some memories are wiped and you live again’. If you concede that wiping a couple days of memory doesn’t destroy the person, then I think that’s enough for me. Probably not you, though. What’s the specific argument in that case?”
I think I must have missed this part before. Where I differ is in the idea that a copy is “me” living again; I don’t accept that it is, for the reasons previously written. Whether or not a being with a me-identical starting state lives on after I die might be the tiniest of solaces, like a child or a well-respected body of work, but in no way is it “me” living on in any meaningful way that I recognize. I get the exact opposite take on this, but I agree with even a stronger form of your statement: to say that “ALL memories are wiped and you live again” (my conditions would require this to read “you continue to live”) is marginally more desirable than “you die and that’s it”. Funny about that.
I get the exact opposite take on this, but I agree with even a stronger form of your statement: to say that “ALL memories are wiped and you live again” (my conditions would require this to read “you continue to live”) is marginally more desirable than “you die and that’s it”.
So continuity of consciousness can exist outside of memories? How so? Why is memory-wiped you different than any random memory-wiped person? How can physical continuity do that?
I see factual memory as a highly changeable data set that has very little to do with “self”. As I understand it (I’m not an expert in neuroscience or psychiatry, but I have experience working with neurologically impaired people), the sorts of brain injuries which produce amnesia are quite distinct from those that produce changes in personality, as reported by significant others, and vice versa. In other words, you can lose the memories of “where you came from” and still be recognized as very much the same person by those who knew you, while you can become a very different person, in terms of disposition, altered emotional response to identical stimuli relative to pre-injury status, etc., with fully intact memories (I’m less clear on what constitutes “personality”, but it seems to be more in line with people’s intuitive concept of “self”). The idea of a memory wipe and continued existence is certainly a “little death” to my thinking, but marginally preferable to actual death. My idea of consciousness is one of passive reception. The same “I”, or maybe “IT” is better, is there post memory wipe.
If memory is crucial to pattern identity then which has the greater claim to identity: The amnesiac police officer, or his 20 years of dashcam footage and activity logs?
still be recognized as very much the same person by those who knew you
Yes or no: will those who knew them be able to pick them out, blind, from a group, going only on text-based communication? If not, what do you mean by recognize? (If yes, I’ll be surprised and will need to reevaluate this.)
If memory is crucial to pattern identity then which has the greater claim to identity: The amnesiac police officer, or his 20 years of dashcam footage and activity logs?
The officer can’t work if they’re completely amnesiac. They can’t do much of anything, in fact.
As to your main point: it’s possible that personality changes remain after memory loss, but those personalities are themselves caused by experiences and memories. I suppose I was assuming that a memory wipe would wash away any recognizable personality. I still do. The kinds of amnesia you’re referring to presumably leave traces of the memory somewhere in the brain, which then affect the brain’s outputs. Unless we can access the brain directly and wipe it ourselves, we can’t guarantee everything was forgotten, and it probably does linger on in the subconscious; so that’s not the same as an actual memory wipe.
I believe there is a functional definition of amnesia: loss of factual memory, with life skills remaining intact. I guess I would call what you are calling a memory wipe a “brain wipe”, and I’d call what you are calling memory “total brain content”. If a brain is wiped of all content in the forest, is Usul’s idea of consciousness spared? No idea. Total brain reboot? I’d say yes, and call that as good as dead, I think.
I would say probably yes to the text only question. Again, loss of factual memory. But I don’t rate that as a reliable or valid test in this context.
OK, imagine that somewhere far away in the universe (or maybe one room over, it doesn’t matter) there is an exact physical replica of you that is also, through some genius engineering, being provided the exact same percepts (sight, hearing, touch, etc.) that you are. Its mental states remain exactly identical to yours.
Should you still care? To me it’d still be someone different.
Suppose I offer you a dollar in return for making a trillion virtual copies of you and shooting them all with a gun, with the promise that I won’t make any copies until after you agree. Since the copies haven’t been made yet, this ensures that you must be the original, and since you don’t care about identical copies of yours (they’re technically different people from you), you happily agree. I nod, pull out a gun, and shoot you.
(In the real universe—or at least the universe one level up on the simulation hierarchy—a Mark Friedenbach receives a dollar. This isn’t of much comfort to you, of course, seeing as you’re dead.)
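(Aside: the trap in this deal can be made explicit with a bit of anthropic arithmetic. A minimal sketch, assuming uniform self-sampling over identical instances; both that assumption and the disvalue assigned to being shot are mine for illustration, not part of the comment.)

```python
# After you agree, N virtual copies are created, each of whose experience
# up to this moment is identical to the original's. Under uniform
# self-sampling over identical instances, your credence that you are the
# original is 1/(N+1), not 1.
N = 10**12  # the trillion copies from the deal above

p_original = 1 / (N + 1)
p_copy = N / (N + 1)
print(f"P(original):    {p_original:.3e}")   # ~1e-12
print(f"P(doomed copy): {p_copy:.12f}")      # ~0.999999999999

# Expected payoff of accepting: one dollar if you are the original,
# being shot (valued here at -V, an illustrative number) otherwise.
V = 1e6
print(p_original * 1 + p_copy * -V)  # astronomically negative
```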
You shouldn’t murder sentient beings or cause them to be murdered by the trillion. Both are generally considered dick moves. Shame on you both. My argument: a benefit to an exact copy is of no intrinsic benefit to a different copy or the original, unless some Omega starts playing evil UFAI games with them. One trillion other copies are unaffected by this murder. Original or copy is irrelevant; it is the being we are currently discussing that is relevant. If I am the original, I care about myself. If I am a copy, I care about myself. Whether I even care if I’m a copy depends on various aspects of my personality.
If I offered you the same deal I offered to Mark Friedenbach, would you agree? (Please answer with “yes” or “no”. You’re free to expand on your answer, but first please make sure you give an answer.)
No. It’s a dick move. Same question and they’re not copies of me? Same answer.
As I’m sure you’re aware, the purpose of these thought experiments is to investigate what exactly your view of consciousness entails from a decision-making perspective. The fact that you would have given the same answer even if the virtual instances weren’t copies of you shows that your reason for saying “no” has nothing to do with the purpose of the question. In particular, telling me that “it’s a dick move” does not help elucidate your view of consciousness and self, and thus does not advance the conversation. But since you insist, I will rephrase my question:
Would someone who shares your views on consciousness but doesn’t give a crap about other people say “yes” or “no” to my deal?
Sorry if my attempt at coloring the conversation with humor upset you. That was not my intent. However, you will find it did nothing to alter the content of our discourse. You have changed your question. The question you ask now is not the question you asked previously.
Previous question: No, I do not choose to murder trillions of sentient me-copies for personal gain. I added an addendum, to provide you with further information, perhaps presuming a future question: Neither would I murder trillions of sentient not-me copies.
New question: Yes, an amoral dick who shares my views on consciousness would say yes.
No, I don’t want you to murder a trillion people, even if those people are not me.
Care in terms of what? You have no way of knowing which one you are, so if you’re offered the option to help the one in the left room, you should, because there’s a 50% chance that’s you. I would say it’s not well defined whether you’re one or the other, actually; you’re both until an “observation/divergence”. But what specific decision hinges on the question?