> If I understood you correctly, then on your account, his claim is simply false, but he isn’t necessarily lying. Yes?
Yes, in the sense that it is a belief about his own history, which is either true or false, like any historical fact. Whether it is actually false depends on the nature of “personal identity”. If I understand the original post correctly, Eliezer would argue that his claim is true. I think Eliezer’s argument lacks sufficient justification, and there’s a good chance his claim is false.
> It seems to follow that he might actually have a sense of subjective experience, which is separate from memories or shared history, which gives him the belief, real or illusory (in this case illusory), that he is the same person from moment to moment, day to day, and the same person who walked into the teleporter, and which separates him from his clones. Yes?
Yes. My question is: is that belief justified?
If your memory were altered so as to make you think you had won the lottery, that wouldn’t make you any richer. Likewise, You!Mars’ memory was constructed by the transporter machine, following the transmitted design, in such a way as to make him remember stepping into the transporter on Earth just as you did, and walking out of it on Mars in seamless continuity. But just because he doesn’t remember the deconstruction, information-transmission, and reconstruction steps doesn’t mean they didn’t happen. Once he learns what actually happened during his transport, his decision about whether he remains the same person who entered the machine on Earth depends greatly on his model of consciousness and personal identity/continuity.
OK, understood.

> Yes. My question is: is that belief justified?

Here’s my confusion: a while back, you said:
> That sense of subjective experience separate from memories or shared history is what I have been calling “personal identity.” It is what gives me the belief, real or illusory, that I am the same person from moment to moment, day to day, and what separates me from my clones.
And yet, here’s Dave!Mars, who has a sense of subjective experience separate from memories or shared history which gives him the belief, real or illusory (in this case illusory), that he is the same person from moment to moment, day to day, and the same person who walked into the teleporter, and which separates him from his clones.
But on your account, he might not have Dave’s personal identity.
So, where is this sense of subjective experience coming from, on your account? Is it causally connected to personal identity, or not?
> Once he learns what actually happened during his transport, his decision about whether he remains the same person who entered the machine on Earth depends greatly on his model of consciousness and personal identity/continuity.
Yes, that’s certainly true. By the same token, if I convinced you that I had placed you in stasis last night for… um… long enough to disrupt your personal identity (a minute? an hour? a millisecond? a nanosecond? how long a period of “computational discontinuity” does it take for personal identity to evaporate, on your account, anyway?), you would presumably conclude that you aren’t the same person who went to bed last night. OTOH, if I placed you in stasis last night and didn’t tell you, you’d conclude that you’re the same person, and live out the rest of your life none the wiser.