I’m not sure I follow your objection here, but my best guess is something like “the upload can’t be me, because I’m experiencing a thousand years of agony, and the upload isn’t.”
Is that even close to right?
I won’t presume to speak for the LW consensus, but personally I would say that the upload is me, and the body is also me. When the body dies in the cataclysm, I have died, and I’ve also survived. This sounds paradoxical because I’m used to thinking of my identity as traveling along only one track, but in the case you’re describing Omega’s device has made that no longer true, and in that case I need to stop thinking as though it were.
I am not sure whether either of me, after pressing the button, considers the other me to be them… but I suspect probably not.
Does any of that help?
Oh, and, yes, I press the button. Shortly after pressing it, I both deeply regret having pressed it, and am enormously grateful to myself for having pressed it.
Shortly after pressing it, both of me are grateful to myself for having pressed it. I do still consider the other me to be me. It is only after the agony starts to completely strip away my conscious identity and rational thought that I start to experience the regret, and I suspect even that state of regret wouldn’t last long. Regret is a relatively high-level emotion, one that would be completely overwhelmed and destroyed by the experience of pain and the desperate, incoherent desire for it to stop.
At the risk of utter digression, I’m interested in this question of considering the other me to be me, post-split.
The way I experience identity clearly treats the results of various possible future branchpoints as roughly equivalent to one another (and equivalently associated to “me”), but does not treat the results of past branchpoints that way. A decision that has yet to be made feels very different from one that has already been made.
Normally it doesn’t make much difference—I don’t have much difficulty treating the “me” that put on a different shirt this morning in some other Everett branch as sharing my identity, despite the branchpoint in the past, because we’re so awfully similar—but when we start introducing vast differences in experience, my ability to extend my notion of identity to include them proves inadequate.
The timeless approach you describe strikes me as a useful way of experiencing identity, but I can’t imagine actually experiencing identity that way.
Is this perspective something that seems intuitively true to you, or is it something you’ve trained (and if so how?), or is it more that you are describing your intellectual rather than your emotional beliefs, or …?
Just to be clear: this is entirely a question about human psychology; I’m not asking about the “actual nature of identity” out in the world (whatever that even means, if indeed it means anything at all).
It does seem like something that is intuitively true. I suspect having spent a lot of time considering bizarre duplication-based counterfactuals has had some influence on my intuitions, bringing the intellectual and emotional beliefs somewhat closer together.
Also note that the emotional experience of identifying as ‘me’ isn’t an all-or-nothing question. Even in everyday experience the extent to which I self-identify as ‘me’ does vary—although always in the high ranges. The question of which parts are me comes into it here, as would experimenting with localized magnetic stimulation of certain parts of the brain, if you really looked at the science!
Note that I would not (I guess) continue to identify with the other me as me indefinitely. It would probably go from something like looking in a mirror (an abstracted, intellectual one in this example) to only a vague feeling of association, over time and depending on stimulus.
In the other direction, there are definitely parts of my past history that I don’t experience as ‘me’ either—and not purely as a function of time. There are a couple of memories from when I was 5 that feel like me, but some from even my twenties (I am less than thirty) that barely feel like me at all.
I compare this to the experience of turning into a vampire in Alicorn’s Luminosity fanfiction. (FYI: that means a couple of days of extreme pain that does not cause any permanent damage.) While being tortured I may not feel all that much identification with either pre-torture human me or post-torture vampire me. As vamp-wedrifid I would (probably) feel a somewhat higher identification with past-human-wedrifid as being ‘myself’. Say, ballpark 80%. From the perspective of painful-half-turned-wedrifid, the main difference in experience from the me in this Omega counterfactual would be the anticipation of being able to remember the torture as opposed to not. Knowing the way the time forks are set up, it would make a little difference, but not all that much.
Summary: Yes, the timeless perspective relates to actual anticipated experience, not just intellectual abstraction.
I’ve been thinking about this some more, and I’d like to consult your intuitions on some related questions, if you don’t mind.
Suppose I come along at T1 and noninvasively copy you into a form capable of effectively hosting everything important about you. (E.g., a software upload, or a clone body, or whatever it takes.) I don’t tell either of you about the other’s existence.
Let’s label the resulting wedrifids W1 and W2 for convenience. (Labels randomly assigned to the post-copy yous.)
I then at T2 convert W2 into a chunk of pure orgasmium (O).
If I’ve understood your view, you would say that at T2, W1 undergoes a utility change (equal to [value(W2) - value(O)]), though of course W1 is unaware of the fact. Yes?
Whereas in an alternative scenario where at T2 I create a chunk of orgasmium (O2) out of interstellar hydrogen, without copying you first, W1 (which is uniquely you) doesn’t experience any utility change at all at T2. Yes?
Feel free to replace the orgasmium with anything else (rocks, a different person altogether, a puppy, W2 experiencing a thousand years of torture, etc.) if that changes your intuitions.
As situations become harder to imagine in a tangible sense it becomes harder to extrapolate from intuitions meaningfully. But I can give some response in this case.
Utility functions operate over entire configuration states of the universe—values of objects or beings in the universe cannot by default be added or subtracted. Crudely speaking, W1 undergoes a utility change of value(universe has W1, O) - value(universe has W1, W2). The change would be significant—clones have value, and this is the first clone. Transforming a hypothetical W534 into orgasmium would be a far, far lesser loss.
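To make that concrete, here is a minimal sketch in Python. Everything in it is made up for illustration (the numbers, the labels, the diminishing-returns rule for clones); the only point is that the change is a difference between whole-world configurations, not a subtraction of per-object values.

def utility(world):
    # world maps labels to object kinds, e.g. {'W1': 'wedrifid', 'W2': 'wedrifid'}
    kinds = list(world.values())
    u = 0.0
    # Clones are not interchangeable chunks of value: the first copy adds a lot,
    # a hypothetical 534th adds almost nothing (made-up diminishing returns).
    for i in range(kinds.count('wedrifid')):
        u += 100.0 * (0.5 ** i)
    u += 10.0 * kinds.count('orgasmium')   # illustrative value for orgasmium
    u += 0.0 * kinds.count('hydrogen')     # interstellar hydrogen: nothing special
    return u

before = {'W1': 'wedrifid', 'W2': 'wedrifid'}
after = {'W1': 'wedrifid', 'O': 'orgasmium'}
delta = utility(after) - utility(before)  # 110.0 - 150.0 = -40.0 with these made-up numbers

There is no standalone value(W2) or value(O) anywhere in that calculation; the loss only shows up when you compare the two whole configurations.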
It is worth elaborating here that the states of the universe that utility is evaluated on are timeless. The entire wave equation gets thrown in, not just a state at a specific time. This means [W1, hydrogen → W1, W2 → W1, O] can be preferred over [W1, hydrogen → W1, O], or anti-preferred, as appropriate, without it being an exceptional case. This matches the intuitions most people have in everyday use—it just formulates them coherently.
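A similarly rough sketch of the timeless part, again with entirely made-up numbers: the utility function ranks whole histories, so a history in which W2 exists and is then converted can come out above or below one in which W2 never exists at all, depending on the (arbitrary, illustrative) history-dependent term.

def history_utility(history):
    # history is a tuple of world-configurations, earliest to latest
    per_kind = {'wedrifid': 100.0, 'orgasmium': 10.0, 'hydrogen': 0.0}
    u = sum(per_kind[kind] for world in history for kind in world.values())
    # Example history-dependent term: a penalty for any history in which a
    # wedrifid that once existed no longer exists at the end.
    ever = {label for world in history for label, kind in world.items() if kind == 'wedrifid'}
    final = {label for label, kind in history[-1].items() if kind == 'wedrifid'}
    return u - 500.0 * len(ever - final)

copy_then_convert = (
    {'W1': 'wedrifid', 'X': 'hydrogen'},
    {'W1': 'wedrifid', 'W2': 'wedrifid'},
    {'W1': 'wedrifid', 'O': 'orgasmium'},
)
never_copied = (
    {'W1': 'wedrifid', 'X': 'hydrogen'},
    {'W1': 'wedrifid', 'O': 'orgasmium'},
)
history_utility(copy_then_convert)  # 410.0 - 500.0 = -90.0
history_utility(never_copied)       # 210.0; flip the sign of the penalty and the ranking flips

Nothing exceptional is needed to get either ranking; it is just a different preference over histories.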
In the alternative scenario (the orgasmium created straight from hydrogen, with no copy made), W1 does not seem to care all that much about what happened at T2. Maybe a little. Orgasmium sounds kind of more interesting to have around than hydrogen.
Also note that the transition at T1 leaves W2’s utility function at a high percentage of W1’s—although W2 definitely doesn’t know about it!
Huh.
I’m not sure I even followed that. I’ll have to stare at it a while longer. Thanks again for a thoughtful reply.
Neat. I’m somewhat envious.
Totally agreed about identifying-as-me being a complex thing, and looking at brain science contributing to it. Actually, when I first encountered the concept of blindsight as an undergraduate, it pretty much destroyed my intuition of unique identity.
I guess I just haven’t thought carefully enough about self-duplication scenarios.
Thanks for the thoughtful reply.