If I gradually replace every atom in your brain with a different one until you have no atoms left, but you still function the same, are you still “you”? If not, at what point did you stop?
Have you seen Yudkowsky’s series of posts on this?
I’m familiar with the concept, though not his specific essays, and, indeed, this literally does happen. Our neurons are constantly making and unmaking their largely proteinaceous components, and over a lifetime there is no doubt a partial, perhaps complete, turnover of the brain’s atoms. I find this not problematic at all, because the electrochemical processes which create consciousness go on undisturbed. The idea that really queers my argument for me is that of datum-by-datum transfer: each new datum, in the form of neuronal activity (the 1s and 0s of electrical discharge or non-discharge), is started by my current brain but saved in another. Knee-jerk, I would tend to say that the transfer is a complete one and that my self has been maintained. The problem comes when, by the exact same datum-by-datum process, a copy is made rather than a transfer, run in parallel until completed and then cloven (good luck untying that knot). At the end I have two beings who seem to meet my definition of Me.
However, this argument does not convince me of the contrary position of the sameness of self and copy, and it does nothing to make me care about a me-like copy coming into being a thousand years from now, and does not induce me to step up onto Dr Bowie-Tesla’s machine.
At what price do you fall into the drowning pool in order to benefit the being, 100 m to your left, that feels exactly as if it were you, as you were one second ago? How about one who appears 1,000,000 years from now? The exact eyes that see these words will be the ones in the water. I can’t come up with any answer other than “fuck that guy”. I might just be a glass-half-empty kind of guy, but someone always ends up stuck in the meat, and it’s going to be that being which remains behind these eyes.
Note that you can believe everyone involved is “you”, and yet not care about them. The two questions aren’t completely orthogonal, but identifying someone with yourself doesn’t imply you should care about them.
At what price do you fall into the drowning pool in order to benefit the being, 100 m to your left, that feels exactly as if it were you, as you were one second ago?
The same price I would accept to first feel the pain of drowning and then have the last second erased from my memory. That price is actually not so easy to set: I’m not sure how much you’d have to pay me to accept X amount of pain plus removal of the memory of it, but it’s probably less than you’d have to pay for X amount of pain alone.
How about one who appears 1,000,000 years from now?
That’s like removing the last second of memory, plus the pain, plus jumping forward in time. I’d probably only do it if I had a guarantee that I’d survive, be able to get used to whatever goes on in the future, and be happy there.
Can you read http://lesswrong.com/lw/qp/timeless_physics/, http://lesswrong.com/lw/qx/timeless_identity/, and http://lesswrong.com/lw/qy/why_quantum/, with any relevant posts linked therein? (Or just start at the beginning of the quantum sequence.)