Thanks for the link.
Well, then, so much for the following:
Perfect simulations
...and therefore ever being sure that a different instantiation of me has the same utility to me as myself.
...and therefore having to care about what some future deity might or might not claim to do to/for 10^^10 “copies” of me (not that I did anyway, because while inferential distance should not disqualify a conclusion, it damn well should discount it).
The proposition that instantiations of me occasionally appear out of nowhere at random points in the universe and usually die horribly soon after (which I suspected had a hidden error somewhere from the moment I first heard it).
Unfortunately, this also probably means I should write stipulations against destructive uploading into my will and pray that the overconfident patternists at whose mercy I’ll probably find myself consider it cost-effective to respect my wishes.
No. We ruled out perfect quantum-level copies, which does not rule out near-copies being morally relevant.
For example, you-plus-one-second is completely different at the quantum level thanks to uninteresting things like thermal noise, yet that future person is just as important as you are. Likewise, a molecule-level copy of you would be pretty much the same (at least physically) as ordinary movement through time, to within the tolerances we normally deal with. So, I suspect, would a cell-level copy plus a few characterization parameters like “connection strength” and “activation level”, and I bet you could even drop activation level (as happens in sleep) and many low-level details like exact cell arrangement.
Basically, humans are not exactly isolated 0.01 Kelvin quantum computers (and even those have decoherence times much less than a second), so if you want exact continuity, you already don’t have it. You have to generalize your moral intuitions to approximate continuity just to accept normal things like existing at 310 K, sneezes destroying thousands of brain cells, sleep rebooting everything, and random chemicals influencing your cognition. Many people who do so decide that the details of the computational substrate don’t really matter; it’s the high-level behavior that matters. Hence patternism.
I’d be interested in whether you still disagree and why.
I’m still trying to figure out what does and does not matter to my concept of continuity and why.
Let me ask you this: which person from five seconds in the future do you care more about protecting: yourself or a random person similar to you?
Is there some theoretical threshold of similarity between yourself and another person beyond which you will no longer be sure which location will be occupied by the brain-state your current brain-state will evolve to in the next second?
To your first question: all else equal, I think I prefer my own future self. It depends how similar, though. If this “random” person were in fact more similar to myself than I currently am, I’d prefer them. As for what I mean by “more similar to myself than I currently am”, I’m just leaving open the possibility that there are things that make my default future self not the optimum in terms of continuity. For example, what if this other person remembers things that I have forgotten?
The way I think of this, there isn’t really a fundamental concept of continuity; the question is really “What kinds of processes do I want the universe to be turned into?” There’s no fundamental concept of “me”; it’s just that I prefer the future to contain people who have properties X and Y, as opposed to properties W and Z.
Likewise, there is no fundamental concept of “person” apart from the structure of our preferences over what gets done with this computational substrate we have.
As for your second question, about a threshold of similarity: I’m not sure that makes sense. I certainly have intuitions about “continuity”, but they might be broken in cases like that. For a question like that, I think talking about continuity has to be replaced with the more general question of what I want the future to look like.
(A future in which there are two of me? Yes please. As for which is the “real me”, who cares?)