Personal identity: physical view or psychological view?
[pollid:94]
Physical view: The maintenance of personal identity requires bodily continuity. So, for instance, one cannot preserve a person by downloading their psychological state into a computer.
Psychological view: The maintenance of personal identity requires continuity of psychological states. As long as there is a continuing stream of psychological states with the appropriate causal relations between them, the person persists.
I answered “psychological”, but I should perhaps note that I don’t understand “continuing” to imply “uninterrupted”. I have no problem with the idea of a personal identity that is shut down for a while before being booted back up (with its internal state saved), or one that is computed on a timesharing system.
Yeah, I should have clarified. “Continuity” here does not mean temporal continuity; it means causal continuity. Future states are appropriately causally related to past states. So if I disintegrate right now, and simultaneously, by some bizarre chance, an atom-for-atom duplicate of me is produced by a thermal fluctuation on Mars, that duplicate would not be me, since the appropriate causal connections between my psychological state and his are lacking.
Hm.
My instinct is to say that I don’t require causal continuity either… e.g., to say that if I appeared on Mars I would consider myself to still be the person who used to exist on Earth, despite the lack of causal connection.
That said, I don’t really take that instinct seriously, since I’m incapable of actually imagining this happening without positing a causal connection I just don’t happen to be aware of. The alternative is… well, unimaginable.
So, I dunno. Maybe I do require continuity in that sense, I’m just willing to posit it for any sufficiently complex system.
So where does the information to build the copy of you on Mars come from? It’s all fine and well to say “thermal noise” but if you allow for brains to be built from thermal noise with any sort of frequency, you end up with the bigger philosophical problem of Boltzmann brains. Unless you’re proposing a mechanism by which the brain in question is your brain, in which case you’ve reintroduced causality.
I agree that Boltzmann brains are a philosophical problem, but they’re a problem precisely because our current best physical theories tell us that brains can fluctuate into existence. I don’t think the right way to deal with the problem is to say, “Boltzmann brains are problematic, so let’s just deny that they can exist.”
Yes, but our current best physical theories also mean that they probably fluctuate into existence considerably less often than they form under normal circumstances (human brains, at least). A mind is a complex thing, so the amount of information it takes to replicate a mind is probably far higher than the amount of information it takes to specify an environment likely to give rise to a mind. If you discard the causal process that gives rise to minds in practice and postulate thermal noise as the cause instead, you end up postulating Boltzmann brains as well.
I didn’t mean that Boltzmann brains are a particularly big philosophical problem, just that they become one when you try to do philosophy where you postulate very specific things occurring by “random chance”.
That’s not how I understood the question. Now it turns out my vote is wrong.
I should not have voted without understanding the question fully. But if I read the survey where the questions are taken from, I will probably also learn what is considered there to be the mainstream position, which will bias my answers.
Other: Leaning toward a causal view. In other words, your past self has to be the cause of your future self, but the specific atoms are irrelevant.
Holy crap! I’m identical with my kid!
Causal descent is a necessary but not sufficient condition, just like a QM-ignorant “physicalist” doesn’t necessarily believe that if I grind you up and make a new person out of those “particular particles”, it is the same person just in virtue of being made out of the “same particles”. Not that there’s any such thing as the “same particles” in modern physics, just waves in a particle field, etc.
Right, but causal descent is common to the physical and psychological views. ‘Physicalism’ among philosophers generally doesn’t refer to some kind of ‘same atoms’ view. That’s an incoherent view long before we bring in considerations of quantum physics, and the ‘same particles’ issue. Mostly that kind of physicalism is restricted to people who are wrong on the internet.
Physicalism among (most) philosophers who hold that view is the claim that your identity is tied to a particular animal (or whatever hardware) that has physical persistence conditions (like the processes which keep it alive, etc.). If you create an atom-for-atom duplicate of that animal, and then kill one of the two of them, you haven’t therefore killed both of them. They’re not identical in that sense, and that’s the sense of ‘identity’ that physicalists are calling personal identity.
So nothing about quantum physics, so far as I can see, makes a difference to this question.
Holy crap, identical with everybody who ever lived. Except those, of course, who were not self-aware. If such exist.
Based on pragmatist’s interpretation, this sounds like the physical view.
It sounds like the psychological view to me, although I guess that depends on what Eliezer means by “self”.
This confuses me. I’m a bunch of LessWrong posts?
I voted Accept: psychological view.
Voted other for essentially this reason. Still very confused about this question.
I answered “lean toward psychological view”, as I’m about evenly split between the psychological view and just considering the whole concept of personal identity incoherent/a wrong question in the first place.
Same. I think I might possibly be describable as more of an anticriterian than anything else.
Other: I suspect the answer to this question depends on the particular question you’re asking. Often, I think, this is a values question—e.g. in what form do I want to continue existing?
Other: I think that personal identity is in a certain sort of brain process, rather than a static “brain state” (so, no “static minds”). The physical view implies that personal identity can be reduced to certain brain states, and I reject that: a valid sort of active processing is important.
Physical view/other (?). I consider personal identity to be based only on the memory of the past self, not the actual brain process or the existence of a memory-independent conscious mind. It may very well be that the experience of personal continuity is only an illusion, as there is no actual continuity at all, only the recollection of previous events (including the attempts at introspection). An exact copy of me wouldn’t be any different from myself.
Other: For everyone else that I observe, an exact atom-for-atom duplicate is the same person as the original. If a copy of me were made, my ‘mind’ would reside in the original.
I accept that my duplicate would claim to be the original, of course.
I’m not sure what you mean by “the original” here.
Suppose the atom-for-atom duplicate were constructed (for sound technical reasons) inside a duplication chamber, and it came to awareness inside that chamber. Would it claim that it had somehow been swapped into the chamber and the duplicate swapped out, without it noticing? Or would it acknowledge that it had been constructed in the duplication chamber, but claim to be the original nonetheless?
Whether the duplicate claimed to be the original or not depends on the individual, I suppose.
If I lived in a world that contained such duplication chambers, and found myself waking up in one, I would not know whether I was “the copy” or not without some outside evidence. I’d be inclined to accept that either I was a copy, or someone was playing a trick on me to make me think so.
I understand that the duplicate would have the same memories and personality as me, but would not have my subjective sense of experience.
OK, now I’m confused.
You said in this case you’d be inclined to accept that either you were a copy, or someone was playing a trick on you to make you think so. Which makes sense.
Would your duplicate be equally inclined to do the same thing in the same case?
If so… then why would your duplicate claim to be the original?
If not… what accounts for the difference?
Yes, my duplicate would think the same way as me.
In a world that has duplicators, my duplicate would not claim to be original without evidence one way or the other.
In our real world, if a copy of me were made using “magic”, both versions would believe themselves to be the original (at least at first). I had this kind of very specific scenario in mind when I said both would claim to be original, but did not explain this in the earlier comment (inferential distance and all that).
That’s not ‘other’, that’s the psychological view (assuming you would still say they’re the same person if one was duplicated minus a left pinky).
I don’t know: If someone I knew had their physical body destroyed but they were uploaded with complete accuracy, I would consider them to be the same person (consistent with psychological view). I would not opt for that procedure for myself, though, because I don’t accept that my upload would really be me (more like physical view).
I’m open to evidence and argument on this, though.
This is consistent how?
Well, I’m not obligated to use the same standards for myself as other people.
I can observe myself in a way that I can’t observe others.
From my vantage point, a copy or upload of someone else behaves the same as the ‘original’. From that same vantage point, a newly created copy of myself is clearly ‘outside’ my mind and therefore observationally different.
But surely the copy is as much the same person as the “you” of five minutes ago as the original?
To you and everyone else, but not to me.
They may not be you(now), but if you count yourself as the same person as you(earlier), then they have to be the same person as you(earlier) as well. I think.
A newly created copy or electronic upload of me (call him ‘Copy B’) would have all my behavioral attributes and memories. He could be called $myName by anyone else observing either of us (we could be indistinguishable to a third observer).
However, to me (the guy writing this response, call me ‘Copy A’), there would be an obvious observable difference between Copy A and Copy B. I see the world from Copy A’s point of view, with his eyes and ears, and I would observe Copy B from the outside as I would any other person, without knowing what is going on in his mind or experiencing the world from his point of view. Yes, Copy B might say the same about Copy A, but my fear is that Copy A would never find himself genuinely waking up inside a copying chamber or as an upload. If that’s true, uploading myself would be the death of my subjective point of view.
I get where you’re coming from. I don’t necessarily have an epiphenomenal view of the mind, but I also believe that the concept of qualia is not well understood by anyone. I do not understand why I’m me and not someone else, and neither does our current knowledge on the subject.
Based on this I’m agnostic on whether mind uploading in the style we’re discussing would really preserve me and my stream of qualia, or kill me and create another person with a new stream of qualia. Without any evidence that it would preserve me, I would not accept going through such a process.
There are possible scenarios in which the copying process could preserve what I consider to be me: For example, if there is only one observer at all, who experiences all qualia streams throughout the world (that possibility scares me, honestly). Another possibility might be that copying me would simply double my measure in the world, and what I consider my qualia stream would have twice as many experiences after the copying process. These are just speculation, though.
This has definitely been an interesting discussion for me. Examining my thoughts on this subject has raised more possible interpretations than settled anything, though!
I had the same reaction, but the majority of others I’ve talked to disagree with me, so it’s nice to see someone who thinks the same way. Here are my arguments with TheOtherDave (ironic, I know!):
Teleporter Malfunction Scenario
No, I understand that, I’m saying that, while Copy B is not the same person as Copy A, he IS the same person as Copy A was before being copied, at least as much as Copy A is.
What would you do if you discovered you were Copy B in such an experiment? Because presumably he would do the same thing.
I don’t contest your first paragraph.
Regarding your question: I don’t know. Probably update my understanding of this subject.
Other: Psychological states include some bodily states like sensory input, and possibly as much as “social contact.” There is no firm boundary between physical and psychological. Uploads are possible, but will require emulation of more than just a brain. An atom-by-atom instantiation of the same mind (including enough of the body and environment) will be the same person.
Psychological view: This question is standing in for a particular question of values. Otherwise it’s meaningless.
What is the psychological view, if it’s non-physical(ist)?
I believe the distinction is whether ‘you’ are the physical atoms that make up your brain, or just the pattern those atoms form. So if you were instantaneously replaced by an identical copy, would you still be the same person?
Has implications for things like uploading.