The simplest is the theory that the universe (or multiverse) is Very Very Big.
Do you mean this in an Occamian way? I suspect not, but I think you should make it clearer.
Anyway, this is a subject I’ve actually thought about a lot.
A lot of people (including Derek Parfit himself) think continuity is absolutely essential for personal identity / selfhood, but I don’t. I’ve had such terribly marked psychological changes over the years that I cannot even conceive of the answer to the question, “Am I the same person as Grognor from 2009?”, being yes in any real sense. I interpret this experience, along with the actual evidence from neuroscience and physics and the arguments of good philosophers like Daniel Dennett, to mean that personal identity doesn’t really exist and is just a sort of cognitive illusion that I don’t understand, and neither do people smarter than me.
That said, here is an excellent essay by reductionist, atheist, Occam’s-razor-wielding AI researcher Paul Almond arguing that there is no continuity of self. It is a good essay, though sadly a very boring one.
Oh, and I’m not so sure about your assumption that personal measure should not be taken into account in determining whether to purchase cryonics. It’s quite rational to maximize the probability that things as similar to you as possible exist, so that as many of them as possible get to count as “you” in whatever sense matters to whatever it is that does the decision making (tentatively, I’ll call this “you”).
I dunno… I feel essentially the same as the me from 18 years ago. I have terabytes of that person’s memories, at any rate, and no-one else does until you go so far from here as to start encoding those memories into the spacetime coordinates.
The personality changes may very well come down to some change in hormone levels: a few hundred bits’ worth of change, plus some terabytes’ worth of extra memories and learned skills.
That should make me the closest match, much closer than anything else. There’s continuity to the data inside your computer: you write an essay, you copy it around, you edit it, the file system may move the file around to close gaps, you may store it in Google Docs, which will store it in really odd ways and move it around without your knowledge, but there is a causal chain.
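The file analogy above can be made concrete with a toy provenance model (a sketch of my own, not anything from the comment): if every copy or edit produces a new version that keeps a pointer to its parent, the causal chain stays recoverable even after the content itself has changed.

```python
class Version:
    """One snapshot of a document; `parent` links back along the causal chain."""

    def __init__(self, content, parent=None):
        self.content = content
        self.parent = parent

    def lineage(self):
        """Walk parent pointers back to the original, oldest first."""
        chain, node = [], self
        while node is not None:
            chain.append(node.content)
            node = node.parent
        return chain[::-1]


# A draft is written, copied verbatim, then edited:
draft = Version("essay, first draft")
copy = Version(draft.content, parent=draft)    # byte-identical copy
edited = Version("essay, revised", parent=copy)  # content diverges

# The causal chain survives both the copy and the edit:
print(edited.lineage())
# → ['essay, first draft', 'essay, first draft', 'essay, revised']
```

The point of the toy model: two versions can be byte-identical (draft and copy) or quite different (draft and edited), and in either case what ties them together is the chain of causation, not the content match.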
If your personhood definition requires there to be only one of you, I think it already fails.
but there is a causal chain.
You have not made a case that a causal chain is necessary, or even a little bit relevant. I don’t think it is. If an algorithm appears twice at distances so disparate that the two instances could not possibly be causally related, they are still the same algorithm.
Your comment is one of those philosophical confusions that, unbeknownst to its author (that’d be you), puts its conclusion in its premise.
If your personhood definition requires there to be only one of you, I think it already fails.
The full quote was, “and no-one else does until you go so far from here as to start encoding those memories into the spacetime coordinates.” Don’t quote out of context.
It is a physical fact of reality that within a ridiculously huge volume, I am the only one who remembers what I thought about on a train ride in the year 2000 or so, on the way back home from work, err, I mean school. It’s only when you get extremely far away, where the memory of this and other thoughts can be encoded into the coordinates, that you begin to see instances that hold this memory.
(Subject to there being a zillion mes comparatively ‘nearby’ if the MWI is true, of course.)
Edit: and okay, I was thinking about electrical aircraft propulsion, using an electric arc to accelerate the air. I can remember in vivid detail what I imagined then because I committed it to memory (the train was packed and I had to stand, so I couldn’t write it down). I have terabytes of such stuff in my head, and while someone on Earth may have approximately similar memories, the edit distance is huge.
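The “edit distance” invoked here has a standard algorithmic meaning. As a sketch (the comment itself names no algorithm), the usual Levenshtein dynamic program counts the minimum insertions, deletions, and substitutions needed to turn one sequence into another, which is one way to cash out “approximately similar, but the distance is huge”:

```python
def levenshtein(a, b):
    """Minimum number of insertions, deletions, and substitutions
    needed to turn sequence `a` into sequence `b`."""
    # prev[j] holds the distance between a[:i-1] and b[:j]; one row at a time
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(
                prev[j] + 1,               # delete ca
                curr[j - 1] + 1,           # insert cb
                prev[j - 1] + (ca != cb),  # substitute (free if equal)
            ))
        prev = curr
    return prev[-1]


print(levenshtein("kitten", "sitting"))             # → 3 (classic example)
print(levenshtein("electric arc", "electric ark"))  # → 1
```

Two nearly identical memory streams would score a small distance; two merely “approximately similar” lifetimes of memories, compared symbol by symbol, would score an enormous one.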
Yes. Reading LessWrong has nearly convinced me that I don’t exist…
You don’t.
Or the same person as myself in 2008? Or as anybody, anytime?