I shifted to the view that there’s probably no persisting identity over time anyway, and that in some sense I probably die and get reborn all the time.
I think this is just wrong; there is personal identity in the form of preserved information that links your brain at time t to your brain at time t+1, the only other physical system with extremely similar information content (memories, personality, etc.).
I said “in some sense”, which grants the possibility that there is also a sense in which personal identity does exist.
I think the kind of definition that you propose is valid but not emotionally compelling in the same way as my old intuitive sense of personal identity was.
It also doesn’t match some other intuitive senses of personal identity. E.g., if you somehow managed to create an identical copy of me, it would imply that I should be indifferent as to whether I or my copy lives. But if that happened, I suspect that both of my instances would prefer to be the one who lives.
if you somehow managed to create an identical copy of me, it would imply that I should be indifferent as to whether I or my copy lives. But if that happened, I suspect that both of my instances would prefer to be the one who lives.
I suspect that a short, private conversation with your copy would change your mind. The other thing here is that 1 is far from the ideal number of copies of you—you’d probably be extremely happy to live with other copies of yourself up to a few thousand or something. So going from 2 to 1 is a huge loss to your copy-clan’s utility.
I suspect that a short, private conversation with your copy would change your mind
Can you elaborate how?
E.g., suppose that I would get copied, and then one of us would be chosen by lot to be taken in front of a firing squad while the other could continue his life freely. I expect—though of course it’s hard to fully imagine this kind of hypothetical—that the thought of being taken in front of that firing squad and never seeing any of my loved ones again would create a rather visceral sense of terror in me. Especially if I was given a couple of days for the thought to sink in, rather than just being in a sudden shock of “wtf is happening”.
It’s possible that the thought of an identical copy of me being out there in the world would bring some comfort to that, but mostly I don’t see how any conversation would have a chance of significantly nudging those reactions. They seem much too primal and low-level for that.
Here is some FB discussion about the pragmatics of having many clones of yourself. The specific hypothetical setup has the advantage of making people think about what they specifically would do:
https://www.facebook.com/duncan.sabien/posts/pfbid0Ps7QFKFvCHWSV1MUNWLhegXN9MN8Le4MX6k3PHmhDHLfFsSFNc194TYvH1vWCXzbl
What do you think of the view of identity suggested in Greg Egan’s “Schild’s Ladder”, which gives the novel its name?
I’d sketch what that view is, but not without rereading the book and not while sitting in a cafe poking at a screen keyboard. Meanwhile, an almond croissant had a persistent identity from when it came out of an oven until it was torn apart and crushed by my teeth, because I desired to use its atoms to sustain my own identity.
I have read that book, but it’s been long enough that I don’t really remember anything about it.
Though I would guess that if you were to describe it, my reaction would be something along the lines of “if you want to have a theory of identity, that sounds as valid as any other”.
Schild’s ladder is a geometric construction for parallel-transporting a vector along a path on a curved manifold. In the book, one of the characters, as a nine-year-old boy who knows he can expect to live indefinitely—longer than the stars, even—is afraid of the prospect, wondering how he will know that he is still himself and isn’t going to turn into someone else. His father explains Schild’s ladder to him as a metaphor, or more than a metaphor, for how each day you can take the new experiences of that day and update your self in the way truest to your previous self.
On a curved manifold, where you end up pointing will depend on the route you took to get there. However:
“You’ll never stop changing, but that doesn’t mean you have to drift in the wind. Every day, you can take the person you’ve been and the new things you’ve witnessed and make your own, honest choice as to who you should become.
“Whatever happens, you can always be true to yourself. But don’t expect to end up with the same inner compass as everyone else. Not unless they started beside you and climbed beside you every step of the way.”
So, historical continuity, but more than just that. Not an arbitrary continuous path through person-space, but parallel transport along a path.
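To make “parallel transport along a path” concrete, here is a minimal numerical sketch of Schild’s ladder on the unit sphere (toy code of my own, not anything from the book; all the names are illustrative). Each rung builds a small geodesic parallelogram, and carrying a vector around a closed loop brings it back rotated, which is exactly the path dependence the metaphor turns on: travelers who take different routes end up with different inner compasses.

```python
import numpy as np

def normalize(x):
    return x / np.linalg.norm(x)

def exp_map(p, v):
    """Walk distance |v| from point p along tangent direction v (unit sphere)."""
    theta = np.linalg.norm(v)
    if theta < 1e-12:
        return p
    return np.cos(theta) * p + np.sin(theta) * (v / theta)

def log_map(p, q):
    """Inverse of exp_map: the tangent vector at p that points toward q."""
    w = q - np.dot(p, q) * p  # component of q orthogonal to p
    norm_w = np.linalg.norm(w)
    if norm_w < 1e-12:
        return np.zeros_like(p)
    theta = np.arccos(np.clip(np.dot(p, q), -1.0, 1.0))
    return theta * (w / norm_w)

def schild_step(p0, p1, v):
    """One rung of Schild's ladder: transport tangent vector v from p0 to p1."""
    x0 = exp_map(p0, v)                # endpoint of the vector based at p0
    m = normalize(x0 + p1)             # geodesic midpoint of x0 and p1
    x1 = 2.0 * np.dot(p0, m) * m - p0  # extend geodesic p0 -> m by its own length
    return log_map(p1, x1)             # read off the transported vector at p1

def transport(path, v):
    for p0, p1 in zip(path, path[1:]):
        v = schild_step(p0, p1, v)
    return v

def arc(a, b, n=100):
    """Points along the great-circle arc from a to b."""
    return [normalize((1 - t) * a + t * b) for t in np.linspace(0.0, 1.0, n)]

# Carry a vector around a triangle covering one octant of the sphere.
north = np.array([0.0, 0.0, 1.0])
ex = np.array([1.0, 0.0, 0.0])
ey = np.array([0.0, 1.0, 0.0])
loop = arc(north, ex) + arc(ex, ey) + arc(ey, north)

v0 = 0.1 * np.array([1.0, 0.0, 0.0])  # small tangent vector at the north pole
v1 = transport(loop, v0)
print(v0, "->", v1)  # comes back rotated ~90 degrees despite the closed loop
```

The rotation angle equals the solid angle the loop encloses, so even someone who faithfully updates “in the way truest to their previous self” at every step can end up pointing somewhere new after a long enough journey.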
I don’t fully understand the actual math of it, so I’m probably not fully getting it. But if the core idea is something like “you can at every timestep take new experiences and then choose how to integrate them into a new you, with the particulars of that choice (and thus the nature of the new you) drawing on everything that you are at that timestep”, then I like it.
I might quibble a bit about the extent to which something like that is actually a conscious choice, but if the “you” in question is taken to be all of your mind (subconscious and all), then that fixes it. Plus making it into more of a conscious choice over time feels like a neat aspirational goal.
… now I do feel more of a desire to live some several hundred years in order to do that, actually.