Meat or sim, or both being meat, aren’t issues as I see it. I am self; fuck non-self, or, not to be a dick, I value non-self less than self, certainly on existential issues. “I” am the awareness within this mind. “I” am not memory, thought, feeling, personality, etc. I know I am me ipso facto, as the observer of the knower. I don’t care if I was cloned yesterday or one second ago, and there are many theoretical circumstances where this could be the case. I value the “I” that I currently am, just exactly now. I don’t believe that this “I” is particularly changeable. I fear senility because “I” am the entity which will be aware of the unpleasant thoughts and feelings associated with the memory loss, the fear of worsening status, and the eventual nightmarish half-life of idiocy. That being will be unrecognizable as me on many levels, but it is me, whereas a perfect non-senile copy is not me, although he has the experience of feeling exactly as I would, including the same stubborn ideas about his own importance over any other copies or originals.
I don’t believe that this “I” is particularly changeable
I don’t know what you mean by that.
Why can’t a perfect copy be you? Doesn’t that involve epiphenomenalism? Even given the entire state of the world at some time X in the future, I’d still need to specify which of the identical beings are “you”.
It’s a sticky topic, consciousness. I edited my post to clarify further:
I define consciousness as a passively aware thing, totally independent of memory, thoughts, feelings, and unconscious hardwired or conditioned responses. It is the hard-to-get-at thing inside the mind which is aware of the activity of the mind without itself thinking, feeling, remembering, or responding.
Which I recognize might sound a bit mystical to some, but I believe it is a real thing which is a function of brain activity.
As a function of the brain (or whatever processing medium), consciousness, or self, is tied to matter. The consciousness in the matter that is experiencing this consciousness is me. I’m not sure if any transfer to alternate media is possible. The same matter can’t be in two different places. Therefore every consciousness is a unique entity, although identical ones can exist via copying. I am the one aware of this mind as the body is typing. You are the one aware of the mind reading it. Another might have the same experience, but that won’t have any intrinsic value to You or Me.
If I copy myself and am destroyed in the process, is the copy me? If I copy myself and am not destroyed, are the copy and the original both me? If I am a product of brain function (otherwise I am a magical soul), and if both are me, then my brain is a single set of matter in two locations. Are they We? That gets interesting. Lots to think about, but I stand by my original position.
If I gradually replace every atom in your brain with a different one until you have no atoms left, but you still function the same, are you still “you”? If not, at what point did you stop?
Have you seen Yudkowsky’s series of posts on this?
I’m familiar with the concept, though not his specific essays, and, indeed, this literally does happen. Our neurons are constantly making and unmaking the largely proteinaceous components of themselves, and, over a lifetime, there is no doubt a partial, perhaps complete, turnover of the brain’s atoms. I find this not problematic at all, because the electrochemical processes which create consciousness go on undisturbed. The idea that really queers my argument for me is that of datum-by-datum transfer: each new datum, in the form of neuronal activity (the 1s and 0s of electrical discharge or non-discharge), is started by my current brain but saved in another. Knee-jerk, I would tend to say that the transfer is a complete one and my self has been maintained. The problem comes when, by the exact same datum-by-datum process, a copy is made rather than a transfer, run in parallel until complete and then cloven (good luck untying that knot). At the end I have two beings who seem to meet my definition of Me.
However, this argument does not convince me of the contrary position, that self and copy are the same; it does nothing to make me care about a me-like copy coming into being a thousand years from now, and it does not induce me to step up onto Dr Bowie-Tesla’s machine.
At what price do you fall into the drowning pool in order to benefit the being, 100m to your left, that feels exactly as if it were you, as you were one second ago? How about one who appears 1,000,000 years from now? The exact eyes that see these words will be the ones in the water. I can’t come up with any answer other than “fuck that guy”. I might just be a glass-half-empty kind of guy, but someone always ends up stuck in the meat, and it’s going to be that being which remains behind these eyes.
Note that you can believe everyone involved is “you”, and yet not care about them. The two questions aren’t completely orthogonal, but identifying someone with yourself doesn’t imply you should care about them.
At what price do you fall into the drowning pool in order to benefit the being, 100m to your left, that feels exactly as if it were you, as you were one second ago?
The same price I would accept to have the last second erased from my memory, but first feel the pain of drowning. That’s actually not so easy to set. I’m not sure how much it would cost to get me to accept X amount of pain plus removal of the memory, but it’s probably less than the cost for X amount of pain alone.
How about one who appears 1,000,000 years from now?
That’s like removing the last second of memory, plus pain, plus jumping forward in time. I’d probably only do it if I had a guarantee that I’d survive and be able to get used to whatever goes on in the future and be happy.
Can you read http://lesswrong.com/lw/qp/timeless_physics/, http://lesswrong.com/lw/qx/timeless_identity/, and http://lesswrong.com/lw/qy/why_quantum/, with any relevant posts linked therein? (Or just start at the beginning of the quantum sequence.)