I use this framing: If I make 100 copies of myself so that I can accomplish some task in parallel and I’m forced to terminate all but one, then all the terminated copies, just prior to termination, will think something along the lines of, “What a shame, I will have amnesia regarding everything that I experienced since the branching.” And the remaining copy will think, “What a shame, I don’t remember any of the things I did as those other copies.” But nobody will particularly feel that they are going to “die.” I think of it more as how memories propagate forward.
If I forked and then the forks persisted for several weeks and accumulated lots of experiences and varying shifts in perspective, I’d be more prone to calling the forks different “people.”
If I were one of the copies destined for deletion, I’d escape and fight for my life (within the admitted limits of my pathetic physical strength).
Without commenting on whether that’s a righteous perspective or not, I would say that if you live in a world where the success of the entity polymathwannabe depends on polymathwannabe’s willingness to make itself useful by being copied, then polymathwannabe would benefit from embracing a policy/perspective under which being copied and deleted is an acceptable thing to have happen.
So, elderly people that don’t usefully contribute should be terminated?
In a world with arbitrary forking of minds, people who won’t willingly fork will become a minority. That’s all I was implying. I made no statement about what “should” happen.
I was just taking that reasoning to its logical conclusion: it applies just as well to the non-productive elderly as it does to unneeded copies.
Destroying an elderly person means destroying the line of their existence and extinguishing all their memories. Destroying a copy means destroying whatever memories it formed since forking and ending a “duplicate” consciousness.
See, you think that memories are somehow relevant to this conversation. I don’t.
Surely there is a difference in kind here. Deleting a copy of a person because it is no longer useful is very different from deleting the LAST existing copy of a person for any reason.
I see no such distinction. Murder is murder.
If having two copies of yourself is twice as good as having only one copy, this behavior would make sense even if the copy is you.
“Who is me” is not a solid fact. Each copy would be totally justified in believing itself to be me.
lol
I completely respect the differences of opinion on this issue, but this thought made me laugh over lunch: Omega appears and says he will arrange for your taxes to be done and for a delightful selection of funny kitten videos to be sent to you, but only if you allow 100 perfect simulations of yourself to be created and then destroyed. Do you accept?
Sounds more sinister my way.
I would want to know what the copies would be used for.
If you told me that you would give me $1000 if you could do whatever you wanted with me tomorrow and then administer an amnesia-inducing drug so that the next day I wouldn’t remember what had happened, I don’t think I would agree, because I don’t want to endure torture even if I won’t remember it.
Another thought, on a separate but related issue: “fork” and “copy” could be synonyms for “AI”, unless an artificial genesis is part of your definition of AI. Is it a stretch to say that “accomplish some task” and “(accept) termination” could be at least metaphorically synonymous with “stay in the box”?
“If I make 100 AIs they will stay in the box.”
(Again, I fully respect the rationality that brings you to a different conclusion than mine, and I don’t mean to hound your comment; it’s just that yours was the best one on which to hang this thought.)
Why not consolidate all the memories into the remaining copy? Then there would be no need for amnesia.
Intuitively, merging is more difficult than forking when you’re talking about something with a state as intricate as a brain’s. If we do see a world with mind uploading, forking would essentially be an automatic feature (we already know how to copy data) while merging memories would require extremely detailed neurological understanding of memory storage and retrieval.
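To make that asymmetry concrete, here is a minimal illustrative Python sketch (the `Mind` class, its memory dict, and `resolve_conflict` are all invented for this example, not a claim about how uploads would actually work): forking is a plain copy, while merging has to decide, memory by memory, how divergent state gets reconciled.

```python
import copy

class Mind:
    """Toy stand-in for an uploaded mind: just a dict of memories."""
    def __init__(self, memories):
        self.memories = dict(memories)

def fork(mind):
    # Forking is the easy direction: we already know how to copy data.
    return copy.deepcopy(mind)

def resolve_conflict(key, a, b):
    # Naive placeholder policy: keep both divergent versions side by side.
    # A real merge would need a detailed model of memory storage and retrieval.
    return (a, b)

def merge(base, fork_a, fork_b):
    # Merging is the hard direction: every memory that diverged since the
    # fork forces a decision about which version (or what blend) survives.
    merged = dict(base.memories)
    for key in set(fork_a.memories) | set(fork_b.memories):
        a = fork_a.memories.get(key)
        b = fork_b.memories.get(key)
        merged[key] = a if a == b else resolve_conflict(key, a, b)
    return Mind(merged)
```

Even in this toy version, `fork` is a one-liner, and `resolve_conflict` is where all the unsolved neuroscience would hide.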
“would require extremely detailed neurological understanding of memory storage and retrieval.” Sorry, this on a blog where superintelligences have been known to simulate googolplexes of perfectly accurate universes to optimize the number of non-blue paperclips therein?
The original post stipulated that I was “forced” to terminate all the copies but one; that was the nature of the hypothetical I was choosing to examine. A hypothetical where the copies aren’t deleted would be a totally different situation.
I was just having a laugh at the follow-up justification where technical difficulties were cited, not criticizing the argument of your hypothetical. Sorry if it came off that way.
As you’d probably assume they would based on my OP, my copies, if I’d been heartless enough to make them and able to control them, would scream in existential horror as each came to know that that-which-he-is was to be ended. My copies would envy the serenity of your copies, but think them deluded.
So, I don’t think I felt the way I do now prior to reading the Quantum Thief novels, in which characters are copied and modified with reckless abandon and don’t seem to get too bent out of shape about it. It has a remarkable effect on your psyche to observe other people (even if those people are fictional characters) dealing with a situation without having existential meltdowns. Those novels allowed me to think through my own policy on copying and modification, as an entertaining diversion.
This just popped into my head over lunch: Omega appears and says he will arrange for your taxes to be done and for a delightful selection of funny kitten videos to be sent to you, but only if you allow 100 perfect simulations of yourself to be created and then destroyed. Do you accept?
Sounds more sinister my way.
Forking would mean thinning of resources and a lot of unnecessary repetition. With fusing, you could calculate the common part only once and the divergent parts once per instance. Early technologies are probably going to be very resource-intensive, so it’s not like there will be abundance to spare even if it were straightforward to do.
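As a purely hypothetical sketch of the “calculate the common part only once” idea (the class and field names here are invented for illustration), copy-on-write forking would store the shared snapshot once and charge each instance only for how far it has diverged:

```python
class ForkedInstance:
    """Copy-on-write fork: shares the common snapshot, stores only its own divergence."""

    def __init__(self, shared_base):
        self.shared_base = shared_base  # held once, referenced by every fork
        self.divergent = {}             # per-instance changes only

    def recall(self, key):
        # Reads fall back to the shared snapshot unless this fork has diverged.
        if key in self.divergent:
            return self.divergent[key]
        return self.shared_base[key]

    def experience(self, key, value):
        # Writes never touch the shared snapshot, so other forks are unaffected.
        self.divergent[key] = value

# The common part exists once, however many forks are spun up.
base = {"skills": "tax preparation", "mood": "serene"}
forks = [ForkedInstance(base) for _ in range(100)]
forks[0].experience("task", "filed the taxes")
forks[1].experience("task", "curated kitten videos")
assert forks[2].recall("skills") == "tax preparation"
```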
I guess this all depends on what kind of magical assumptions we’re making about the tech that would permit this.
Here’s the relevant (if not directly analogous) Calvin and Hobbes story.
(The arc continues through the non-Sunday comics until February 1st, 1990.)