Gotcha. OK, try this then:
At time T1, I begin replacing my meat with cloud.
At T2, I complete that process.
At T3, I make a copy of my cloud-self.
Is it your intuition that that third step ought to fail? If so, can you unpack that intuition?
If you think that third step can succeed, do you have the same problem? That is, if I can have two copies of my cloud-self running simultaneously, do you not get what that would be like?
My answer to what that would be like is that it would be just like this. That is, if you make a cloud-copy of me while I’m sleeping, I wouldn’t know it, and the existence of that cloud-copy wouldn’t in any way impinge on my experience of the world. Also, I would wake up in the cloud, and the existence of my meat body would not in any way impinge on my experience of the world. There are just two entities, both of which are me.
I guess I have similar problems with the third step. I’m really sorry if it seems like I’m just refusing to update, and thanks a bunch; that last part really did help. But consider the following:
Last night, somebody made a cloud-copy of you, but you don’t know it. In a few hours, that person comes and kills you (maybe you’re asleep when he does it, but I don’t think that really matters).
Isn’t that still like dying? I know that to the world it’s the same, but from the inside, it’s death, right? Have you read HPMoR? Fred and George are basically alternate copies of the same brain. If you were Fred, wouldn’t you rather not die, even though George!you would still be alive and well?
It’s not a problem; this idea is genuinely counterintuitive when first encountered.
The reason it’s counterintuitive is that you’re accustomed to associating “ahartell” with a single sequence of connected observer-moments. Which makes sense: in the real world, it’s always been like that. But in this hypothetical world there are two such sequences, unrelated to one another, and they are both “ahartell.” That’s completely unlike anything in your real experience, and the consequences of it are legitimately counterintuitive; if you want to understand them you have to be willing to set those intuitions aside.
One consequence is that you can both live and die simultaneously. That is, if there are two ahartells (call them A and 1) and A dies, then you die; it’s a real death, just as real as any other death. If 1 survives, then you survive; it’s a real survival, just as real as any other survival. The fact that both of these things happen at once is counterintuitive, because it doesn’t ever come up in the real world, but it is a natural consequence of that hypothetical scenario.
Similarly, another consequence is that you can die twice. That is, if A and 1 both die, those are two independent deaths, each as real as any other death.
And another consequence is that you can live twice. That is, if A and 1 both survive, they are two independent lives; A is not aware of 1, 1 is not aware of A. A and 1 are different people, but they are both you.
Again, weird and counterintuitive, but a natural consequence of a weird and counterintuitive situation.
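If it helps to see the “two sequences, both ahartell” claim in concrete terms, here is a minimal sketch in Python. Everything in it (the PersonHistory class, the logged “moments”) is invented purely for illustration; it just treats a person as an append-only log of observer-moments that gets forked at copy time, after which neither log is privileged over the other.

```python
import copy

class PersonHistory:
    """A person modeled, very crudely, as an append-only log of observer-moments."""
    def __init__(self, name, moments):
        self.name = name
        self.moments = list(moments)

    def experience(self, moment):
        self.moments.append(moment)  # each instance extends only its own log

# One history up to the copy event at T3...
pre_copy = PersonHistory("ahartell", ["wake", "argue about uploading", "fall asleep"])

# ...then the copy: two instances with an identical past and no shared future.
a = pre_copy
one = copy.deepcopy(pre_copy)

a.experience("wake up in the meat body")
one.experience("wake up in the cloud")

# Both logs contain the whole pre-copy history, so by the continuity criterion
# being argued here, both are "ahartell"; neither branch is privileged.
assert a.moments[:3] == one.moments[:3]
assert a.moments[3] != one.moments[3]
```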
Ok, three more questions/scenarios.
1) You are Fred (of HPMoR’s Fred & George, who for this we’ll assume are perfect copies). Voldemort comes up to you and George and says he will kill one of you. If he kills George, you live and nothing else happens. If he kills you, George lives and gets a dollar. Would you choose to allow you!Fred to die? And not just as a sacrifice you know is reasonable in terms of total final utility, but as the obviously correct choice from your perspective. (If the names are a problem, assume somebody makes a copy of you and immediately asks you this question.)
2) If all else is equal, would you rather have N*X copies than X copies, for all positive values of X and all values of N greater than 1? (I don’t know why I worded it that way. Would you rather have more copies than fewer, for all values of more and fewer?)
3) You go to make copies of yourself for the first time. You have $100, with which you can pay for either 1 copy or 100 copies (with a small caveat). If you choose the 100 copies, each copy’s life will be 10% less good, and the life of original/biological!you will be 20% less good (the copy maker is a magical wizard who can do things like this and likes to make people think). Do you choose the 100 copies? And do you think that choice is obviously better, such that one would be stupid to choose otherwise?
Thanks.
Re: #1… there are all kinds of emotional considerations here, of course; I really don’t know what I would do, much as I don’t know what I would do given a similar deal involving my real-life brother or husband. But if I leave all of that aside, and I also ignore total expected utility calculations, then I prefer to continue living and let my copy die.
Re: #2… within some range of Ns where there aren’t significant knock-on effects unrelated to what I think you’re getting at (e.g., creating too much competition for the things I want, losing the benefits of cooperation among agents with different comparative advantages, etc.), I prefer that N+1 copies of me exist rather than N copies. More generally, I prefer the company of people similar to me, and I prefer that there be more agents trying to achieve the things I want more of in the world.
Re: #3… I’m not sure. My instinct is to make just one copy rather than accept the 20% penalty in quality of life, but it’s not an easy choice; I acknowledge that I ought to value the hundred copies more.
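For what it’s worth, the “I ought to value the hundred copies more” part is just what a naive aggregate calculation says. A rough sketch, assuming quality of life maps linearly onto a score normalized to 1.0 and that the copies’ welfare simply adds up (neither of which the wizard’s offer actually specifies):

```python
# Naive aggregate-welfare comparison for question #3.
# Assumes a baseline quality-of-life score of 1.0 per person and simple summation.

one_copy_total = 1.0 + 1.0               # original + 1 copy, no penalties -> 2.0
hundred_copies_total = 0.8 + 100 * 0.9   # original at -20%, 100 copies at -10% -> 90.8

print(one_copy_total, hundred_copies_total)
```

On that accounting the hundred copies win by a factor of roughly forty-five, which is presumably why choosing the single copy still feels hard to justify.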
I’m not trying to back you into a corner, but your responses to #1 and #3 seem to indicate that you value the original more than the others, which would imply that the copies are less you. Your answer to #2 prompted another question: would you value uploading and copying just as much if somehow the copies were P-zombies? It seems like your answers to #1-3 would be the same in that case.
Thanks for being so accommodating, really.
I don’t value the original over the others, but I do value me over not-me (even in situations where I can’t really justify that choice beyond pure provincialism).
A hypothetical copy of me created in the future is just as much me (hypothetically) as the actual future me is, but a hypothetical already-created copy of a past me is not me. The situation is perfectly symmetrical; if someone makes a copy of me and asks the copy the question in #1, I give the same answer as when they ask the original.
I have trouble answering the P-zombie question, since I consider P-zombies an incoherent idea. I mean, if I can’t tell the difference between P-zombies and genuine people, then I react to my copies just the same as if they were genuine people… how could I do anything else?
Thanks. It makes sense (ish) and you’ve either convinced me or convinced me that you’ve convinced me ;).