What if I rewire your neurons so you think you’re Donald Trump? Would that make you Donald Trump? If Mr. Trump died in a tragic boating accident tomorrow, could his family rest easy knowing that he didn’t actually experience death, but lives on in you?
If you rewrite my neurons such that I have all of Donald Trump’s memories (or connections) and none of my own, yes. If you only rewrite my name, no, for I would still identify with the memories. There’s a lot of space between those extremes where I’m partially me and partially him, and I would hazard to forward-identify with beings in proportion to how much of my current memories they retain, possibly diluted by their additional memories.
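To make that proportional rule concrete, here is a minimal Python sketch of one way it could be formalized. The set-of-memories model, the function name, and the particular dilution factor are illustrative assumptions on my part, not anything the comment specifies.

```python
# One possible formalization of "forward-identify in proportion to how
# much of my current memories they retain, possibly diluted by their
# additional memories." Memories are modeled (as an assumption) as a set.

def identification_weight(current: set, future: set) -> float:
    """How strongly I forward-identify with a future being, in [0, 1]."""
    if not current or not future:
        return 0.0
    shared = len(current & future)
    retained = shared / len(current)  # fraction of my memories that survive
    dilution = shared / len(future)   # fraction of the future being that is me
    return retained * dilution

me = {f"my_memory_{i}" for i in range(100)}
trump = {f"trump_memory_{i}" for i in range(100)}

print(identification_weight(me, trump))              # 0.0: full rewrite, not me at all
print(identification_weight(me, me | {"new_name"}))  # ~0.99: a mere name change
half_me = set(sorted(me)[:50])
print(identification_weight(me, half_me | trump))    # ~0.17: partially me, partially him
```

On this sketch, the name-change case stays close to 1 and the full-rewrite case drops to 0, matching the yes/no answers above, with the middle ground falling smoothly in between.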
Ok, what if—like Eternal Sunshine of the Spotless Mind—I gradually eliminate your memories over a period of time. Then maybe—like Dark City—I go in and insert new memories, maybe generic, maybe taken from someone else. This can be done either quickly or slowly, if it matters.
This future continuation of your current self will have nothing other than a causal & computational connection to your current identity. No common memories whatsoever. Would you expect to experience what this future person experiences?
Based on your other comments, I infer that you consider this question entirely different from the question “Are you willing to consider this future person you?” Confirm?
Correct.
Cool, thanks. Given that, and answering for my own part: I’m not sure what any person at any time would possibly ever observe differentially in one case or the other, so I honestly have no idea what I’d be expecting or not expecting in this case. That is, I don’t know what the question means, and I’m not sure it means anything at all.
That’s fair enough. You got the point with your first comment, which was to point out that issues of memory-identity and continuous-experience-identity are, or at least could be, separate.
Perhaps I understand more than I think I do, then.
It seems to me that what I’m saying here is precisely that those issues can’t be separated, because they predict the same sets of observations. The world in which identity is a function of memory is in all observable ways indistinguishable from the world in which identity is a function of continuous experience. Or, for that matter, of cell lineages or geographical location or numerological equivalence.
And I’m saying that external observations are not all that matters. Indeed it feels odd to me to hold that view when the phenomenon under consideration is subjective experience itself.
I didn’t say “external observations”.
I said “observations.”
If you engage with what I actually said, does it feel any less odd?
You said “predict the same set of observations,” which I implicitly took to mean “tell me something I can witness to update my beliefs about which theory is correct,” to which the answer is: there is nothing you—necessarily external—can witness to know whether my upload is death-and-creation or continuation. I alone am privy to that experience (continuation or oblivion), although the recorded memory is the same in either case, so there’s no way the clone could tell you afterward.
You could use a model of consciousness and a record of events to infer which outcome occurred. And that’s the root issue here, we have different models of consciousness and therefore make different inferences.
You keep insisting on inserting that “external” into my comment, just as if I had said it, when I didn’t. So let me back up a little and try to be clearer.
Suppose the future continuation you describe of my current self (let’s label him “D” for convenience) comes to exist in the year 2034.
Suppose D reads this exchange in the archives of LessWrong, and happens to idly wonder whether they, themselves, are in fact the same person who participated in LessWrong under the username TheOtherDave back in January of 2014, but subsequently went through the process you describe.
“Am I the same person as TheOtherDave?” D asks. “Is TheOtherDave having my experiences?”
What ought D expect to differentially observe if the answer were “yes” vs. “no”? This is not a question about external observations, as there’s no external observer to make any such observations. It’s simply a question about observations.
And as I said initially, it seems clear to me that no such differentially expected observations exist… not just no external observations, but no observations period. As you say, it’s just a question about models—specifically, what model of identity D uses.
Similarly, whether I expect to be the same person experiencing what D experiences is a question about what model of identity I use.
And if D and I disagree on the matter, neither of us is wrong, because it’s not the sort of question we can be wrong about. We’re “not even wrong,” as the saying goes. We simply have different models of identity, and there’s no actual territory for those models to refer to. There’s no fact of the matter.
Similarly, if I decide that you and I are really the same person, even though I know we don’t share any memories or physical cells or etc., because I have a model of identity that doesn’t depend on any of that stuff… well, I’m not even wrong about that.
When TheOtherDave walks into the destructive uploader, either he wakes up in a computer or he ceases to exist and experiences no more. Not being able to experimentally determine what happened afterwards doesn’t change the fact that one of those descriptions matches what you experience and the other does not.
What do I experience in the first case that fails to match what I experience in the other?
That is, if TheOtherDave walks into the destructive uploader and X wakes up in a computer, how does X answer the question “Am I TheOtherDave?”
Again, I’m not talking about experimental determination. I’m talking about experience. You say that one description matches my experience and the other doesn’t… awesome! What experiences should I expect X to have in each case?
It sounds like your answer is that X will reliably have exactly the same experiences in each case, and so will every other experience-haver in the universe, but in one case they’re wrong and in the other they’re right.
Which, OK, if that’s your answer, I’ll drop the subject there, because you’re invoking an understanding of what it means to be wrong and right about which I am profoundly indifferent.
Is that your answer?
how does X answer the question “Am I TheOtherDave?”
This is so completely unrelated to what I am talking about. Completely out of left field. How the upload/clone answers or fails to answer the question “Am I TheOtherDave?” is irrelevant to the question at hand: what did TheOtherDave experience when he walked into the destructive uploader.
I’ve rephrased this as many times as I know how, but apparently I’m not getting through. I give up; this is my last reply.
OK.
Of course not. But what does thinking you’re Donald Trump have to do with it? The question at hand is not about who I think I am, but what properties I have.
No, the question at issue here is continuity of experience, and the subjective experience (or rather lack thereof) when it is terminated—death.
Ah, OK. You confused me by bringing up trist thinking they were Donald Trump, which seemed unrelated.
For my own part, I’m not sure why I should care about the nominal difference between two things with identical properties, regardless of how continuous their subjective experience is/has been, and regardless of whether one of them is me.
But I acknowledge that some people do care about that. And not just for subjective experience… some people care about the difference between an original artwork and a perfectly identical copy of it, for example, because the continuity of the original’s existence is important to them, even though they don’t posit the artwork has subjective experiences.
That’s fine… people value what they value. For my own part, I don’t value continuity in that sense very much at all.
Taboo “think”. If you rewire my neurons* to give me the false propositional belief that I am Donald Trump, then no. If you rewire my neurons to an exact copy of Donald Trump’s, then yes.
And, yes, they could, to exactly the degree that they would accept a miraculously-resuscitated Trump who was amnesiac about the day leading up to the boating accident, and who also looked totally different now. But this is a looser requirement: there could be a whole range of threshold people who would be recognised by my family as a valid continuation of me, but whom I could not have anticipated becoming.
*and any other skull-stuff that bears on the problem