You are technically correct, but I don’t think that’s the best kind of correct.
You could decide that future you isn’t you if it’s wearing a blue hat, but that would make no sense at all. So I don’t think logic is out of play here. Technically those could be your values, but it would be really remarkable to find someone that genuinely held those values.
Threaten someone today, then tomorrow tell them you’re a different person, because that’s what your values say. They’ll look at you funny and go on distrusting and disliking you.
If you threaten someone, then go through a perfect duplication process, they’ll dislike and distrust both of you. I doubt you’ll find this relevant, but just to note that both perfect clones are you from an outside perspective. Almost no one would disagree, or decide to trust and like the clone that happened to be the duplicate while still distrusting the original.
“But which am I, really?” you ask again. I submit that the you of today is logically both, or neither, depending on your definition. Your conscious experience of right now does not “have” the conscious experience of you tomorrow. Each moment of consciousness is linked only by memories, beliefs, and having very similar patterns. No two moments of consciousness are the same; when we say they’re the same individual, that’s all we mean.
This sounds absurd. Of course the you of tomorrow is you, and a clone is not you.
But people can and do treat the them of tomorrow as not-them. The most dramatic case is extreme short-term decision-making, as in the meme where someone leaves a note for drunk-them to drink water and eat food, and in the morning finds a note back from drunk-them saying “fuck you, morning self!”. Other I’ll-pay-for-this-later decisions are partly in the same boat.
So, which perfect clone is you tomorrow? Both, if you take a standard view of identity and of why to care about future you. Neither, if you don’t. But choosing one as “you” and not the other is just like disowning future you if they wear a blue hat.
I think this discussion is focusing on how others would behave towards me, and deriving what ought to be regarded as my future self from there. That is certainly a valid discussion to be had. However, my post is talking about a different (though related) topic.
For example, suppose that I, for whatever crazy reason, think the me from tomorrow — the one with (largely) the same physical body and no tricks played on memory whatsoever — is not my future self. Then I would do a bunch of irresponsible things that would lead to others’ dislike of or hostility toward me, which could eventually lead to my demise. But so what? If I regard that as a different person, then to hell with him. The current me wouldn’t even care. So being detrimental to that future person would not compel the current me to regard him as my future self.
Luckily we do not behave that way. Everyone, or at least every rational person, considers the person with the same physical body and the memories of the current self to be themself in the future. That is the survival instinct; that is the consensus.
But that consensus is about an idiosyncratic situation: one where memory (experience) and physical body are bound together. Take that away, and we no longer have a clear, unequivocal basis from which to start a logical discussion. Someone whose basis is the survival of the same physical body would not step into the teleporter, even if it were greatly convenient and would benefit the one who steps out of it. Someone else could start from a different basis. They may believe that only the pattern matters, so mind uploading into a silicon machine to make easy copies, at the cost of adversely affecting the carbon body, would be welcomed. None of these positions can be rebutted by a cost/benefit analysis of some future minds, because they may or may not care about those minds, to varying degrees, in the first place.
Sure, logic is not entirely irrelevant. It comes into play after you pick the basis of your decision. But values, not logic, largely determine the answer to the question.
I agree with all of that.

It’s probably relevant to think about why we tend to value our future selves in the first place. I think it’s that each of us has memories (and the resulting habits) of thinking “wow, past self really screwed me over. I hate that. I think I’ll not screw future self over so that doesn’t happen again”. We care because there’s a future self that will hate us if we do, and we can imagine it very vividly. In addition, there’s an unspoken cultural assumption that it’s logical to care about our future selves.
I included some of how other people regard our identity, but that’s not my point. My point is that, for almost any reason whatsoever you could come up with to value your physically continuous future self, you’d also value a physically discontinuous future self that maintains the same mind-pattern. The exception is deciding “I no longer care about anything that teleports”, which is possible and consistent, but no more sensible than deciding to stop caring about anything wearing blue hats.
So sure, people aren’t necessarily logically wrong if they value their physically continuous future self over a perfect clone (or upload). But they probably are making a logic error, if they have even modestly consistent values.
Have you seen “Relativity Theory for What the Future ‘You’ Is and Isn’t”? It’s making pretty much exactly the same point, in response to the same post. I didn’t get around to commenting there, so here we are.