Well, OK. So suppose that, after I go through that transporter/replicator, you ask the entity that comes out whether it has the belief, real or illusory, that it is the same person in this moment that it was at the moment it walked into the machine, and it says “yes”.
If personal identity is what creates that belief, and that entity has that belief, it follows that that entity shares my personal identity… doesn’t it?
Not quite. If You!Mars gave it thought before answering, his thinking probably went like this: “I have memories of going into the transporter, just a moment ago. I have a continuous sequence of memories from then until now. Nowhere in those memories does my sense of self change. Right now I am experiencing the same sense of self I always remember experiencing, and laying down new memories. Ergo, by backwards induction, I am the same person that walked into the teleporter.” However, for that (or any) line of meta-reasoning to hold, (1) your memories need to accurately correspond with the true and full history of reality, and (2) you need to trust that what occurs in the present also occurred in the past. In other words, it’s kinda like saying “my memory wasn’t altered, because I would have remembered that.” It’s not a circular argument per se, but it is a meta loop.
The map is not the territory. What happened to You!Earth’s subjective experience is an objective, if perhaps not empirically observable, fact. You!Mars’ belief about what happened may or may not correspond with reality.
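To make that concrete, here is a toy sketch in Python (the names and events are entirely my own illustration, not anyone’s formal model): if an agent’s belief about its past is computed from its memories alone, then any two histories that leave identical memories produce identical beliefs, so the belief carries no evidence about which history actually occurred.

```python
# Toy model of the “meta loop”: the agent’s belief about its own
# history is a function of its memory trace alone, so two histories
# that leave the same trace are indistinguishable from the inside.

def remembered(history):
    # By construction, the transporter writes memories that omit the
    # deconstruction / transmission / reconstruction step.
    return [event for event in history if event != "teleport"]

def believes_same_person(memory_trace):
    # You!Mars’ backwards induction: scan the remembered sequence for
    # any felt break in the sense of self. It can only see the trace.
    return "break in sense of self" not in memory_trace

history_ordinary = ["wake on Earth", "walk to the machine", "arrive"]
history_teleport = ["wake on Earth", "walk to the machine", "teleport", "arrive"]

# Identical memory traces...
assert remembered(history_ordinary) == remembered(history_teleport)
# ...hence identical conclusions, whatever actually happened.
assert believes_same_person(remembered(history_ordinary)) == \
       believes_same_person(remembered(history_teleport))
```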
What if me!Mars, after giving it thought, shakes his head and says “no, that’s not right. I say I’m the same person because I still have a sense of subjective experience, which is separate from memories or shared history, which gives me the belief, real or illusory, that I am the same person from moment to moment, day to day, and which separates me from my clones”?
Do you take his word for it? Do you assume he’s mistaken? Do you assume he’s lying?
Assuming that he acknowledges that clones have a separate identity, or in other words admits that there can be instances of himself that are not him, then by asserting the same identity as the person that walked into the teleporter, he is making an extrapolation into the past. He is expressing a belief that, by whatever definition he is using, the person walking into the teleporter meets a standard of me-ness that the clones do not. Unless the definition under consideration explicitly references You!Mars’ mental state (e.g. “by definition” he shares identity with anyone he remembers sharing identity with), the validity of that belief is external: it is either true or false. The map is not the territory.
Under an assumption of pattern or causal continuity, for example, it would be true; under computational continuity, it would be false.
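For concreteness, here is a toy sketch (my own simplified stand-ins for those criteria, not rigorous definitions) showing how the three views can disagree about one and the same teleportation:

```python
# Illustrative simplifications of three identity criteria, applied to
# a destructive teleportation. The field names are hypothetical.

teleport = {
    "same_pattern": True,           # reconstruction matches the scanned design
    "causally_downstream": True,    # the copy was built from a scan of the original
    "process_uninterrupted": False, # the running computation was halted and restarted
}

def pattern_continuity(e):
    # The same arrangement of matter/information suffices.
    return e["same_pattern"]

def causal_continuity(e):
    # The right pattern, produced by the right causal chain.
    return e["same_pattern"] and e["causally_downstream"]

def computational_continuity(e):
    # The computation itself must never stop running.
    return e["process_uninterrupted"]

print(pattern_continuity(teleport))        # True
print(causal_continuity(teleport))         # True
print(computational_continuity(teleport))  # False
```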
If I understood you correctly, then on your account, his claim is simply false, but he isn’t necessarily lying.
Yes?
It seems to follow that he might actually have a sense of subjective experience, which is separate from memories or shared history, which gives him the belief, real or illusory (in this case illusory), that he is the same person from moment to moment, day to day, and the same person who walked into the teleporter, and which separates him from his clones.
If I understood you correctly, then on your account, his claim is simply false, but he isn’t necessarily lying.
Yes, in the sense that it is a belief about his own history, which, like any historical fact, is either true or false. Whether it is actually false depends on the nature of “personal identity”. If I understand the original post correctly, Eliezer would argue that the claim is true. I think his argument lacks sufficient justification, and there’s a good chance the claim is false.
It seems to follow that he might actually have a sense of subjective experience, which is separate from memories or shared history, which gives him the belief, real or illusory (in this case illusory), that he is the same person from moment to moment, day to day, and the same person who walked into the teleporter, and which separates him from his clones.
Yes. My question is: is that belief justified?
If your memory were altered to make you think you had won the lottery, that wouldn’t make you any richer. Likewise, You!Mars’ memory was constructed by the transporter machine, following the transmitted design, in such a way as to make him remember stepping into the transporter on Earth, just as you did, and walking out of it on Mars in seamless continuity. But just because he doesn’t remember the deconstruction, information-transmission, and reconstruction steps doesn’t mean they didn’t happen. Once he learns what actually happened during his transport, his decision about whether he remains the same person that entered the machine on Earth depends greatly on his model of consciousness and personal identity/continuity.
It seems to follow that he might actually have a sense of subjective experience, which is separate from memories or shared history, which gives him the belief, real or illusory (in this case illusory), that he is the same person from moment to moment, day to day, and the same person who walked into the teleporter, and which separates him from his clones. Yes. My question is: is that belief justified?

OK, understood. Here’s my confusion: a while back, you said:

That sense of subjective experience separate from memories or shared history is what I have been calling “personal identity.” It is what gives me the belief, real or illusory, that I am the same person from moment to moment, day to day, and what separates me from my clones.
And yet, here’s Dave!Mars, who has a sense of subjective experience separate from memories or shared history which gives him the belief, real or illusory (in this case illusory), that he is the same person from moment to moment, day to day, and the same person who walked into the teleporter, and which separates him from his clones.
But on your account, he might not have Dave’s personal identity.
So, where is this sense of subjective experience coming from, on your account? Is it causally connected to personal identity, or not?
Once he learns what actually happened during his transport, his decision about whether he remains the same person that entered the machine on Earth depends greatly on his model of consciousness and personal identity/continuity.
Yes, that’s certainly true. By the same token, if I convince you that I placed you in stasis last night for… um… long enough to disrupt your personal identity (a minute? an hour? a millisecond? a nanosecond? how long a period of “computational discontinuity” does it take for personal identity to evaporate on your account, anyway?), you would presumably conclude that you aren’t the same person who went to bed last night. OTOH, if I placed you in stasis last night and didn’t tell you, you’d conclude that you’re the same person, and live out the rest of your life none the wiser.