Note that you can believe everyone involved is “you”, and yet not care about them. The two questions aren’t completely orthogonal, but identifying someone with yourself doesn’t imply you should care about them.
At what price do you fall into the drowning pool in order to benefit the being, 100m to your left, that feels exactly as if it were you, as you were one second ago?
The same price I would accept to have the last second erased from my memory, but first feel the pain of drowning. That price is actually not so easy to set. I’m not sure how much it would cost to get me to accept X amount of pain plus removal of the memory, but it’s probably less than the cost for X amount of pain alone.
How about one who appears 1,000,000 years from now?
That’s like removing the last second of memory, plus pain, plus jumping forward in time. I’d probably only do it if I had a guarantee that I’d survive and be able to get used to whatever goes on in the future and be happy.
Can you read http://lesswrong.com/lw/qp/timeless_physics/, http://lesswrong.com/lw/qx/timeless_identity/, and http://lesswrong.com/lw/qy/why_quantum/, with any relevant posts linked therein? (Or just start at the beginning of the quantum sequence.)