As we move farther away it becomes harder to self-identify. I have some difficulty self-identifying with an upload, but there’s still a fair bit. I have a lot of trouble identifying with a beta copy. In fact, I imagine that an accurate beta version of me would spend a lot of time simply worrying about whether it is me. (Of course, now that I’ve put that in writing, our super friendly AI will likely only make a beta of me that does do that worrying, which is potentially counterproductive. There’s a weird issue here: the more the beta is like me, the more it will worry about this. So a beta that was just like me but didn’t worry would certainly not be close enough in behavior for me to self-identify with it. So such worrying in a beta is evidence that I should self-identify with that beta.)
You will get over it fast. Ever experienced a few hours of memory loss, or found some old writing of yours that you cannot recall? The person you are changes all the time. You will be happy to be there, and you will probably find lots of writing by other minds about how and why an upload is “you” enough.
Yes. And this bothers me a bit. Sometimes I even worry (a teeny tiny bit) that I’ve somehow switched into a parallel universe where I wrote slightly different things (or, in some cases, really dumb things). When I was younger I sometimes had borderline panic attacks over the question of how, if I didn’t remember some event, I could identify as the individual who experienced it. I suspect that I’m a little less psychologically balanced about these sorts of issues than most people...
Most people maybe, but not most LWers, I bet. I have had such attacks too...
I am deeply frightened by the fact that most important developments in my life are accidents. But, well, there is little use in being afraid.
You could try to figure out how much you actually change from one time unit to the next by journaling, or by tracking mental changes. Maybe you will also find a technique that can be adapted to measure your discomfort and to try out ways of reducing it.
I externalize some of my brainpower into note-files, with some funny results.
One way to resolve the dissonance this produces is to quit identifying with yourself from one point in time to the next. Me-from-yesterday can be seen as a different self (though sharing most identity characteristics like values, goals, memories, etc.) from me-from-today.
I dislike this concept, but that is what we are left with. Identity breaks down, and personhood ends.
That’s an interesting way of thinking about it. My take on it is the opposite. If an accurate copy of me was made after my death, I am pretty sure the copy wouldn’t care whether it was me or not, just as I don’t care whether I am as my past self wished me to be. If the copy was convinced it was me, there would be no problem. If it was convinced it wasn’t, then it wouldn’t think of my death as any more important than the deaths of everyone else throughout history.