Intuitively, merging is more difficult than forking when you’re talking about something with a state as intricate as a brain’s. If we do see a world with mind uploading, forking would essentially be an automatic feature (we already know how to copy data) while merging memories would require extremely detailed neurological understanding of memory storage and retrieval.
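The asymmetry can be made concrete with a toy sketch (purely illustrative; the dict-of-attributes "mind state" and the three-way merge policy are my own invented stand-ins, not anything the original claims): forking is one library call, while merging two diverged forks needs a conflict policy for every attribute where both copies changed.

```python
import copy

def fork(state):
    # Forking is just copying: mechanically trivial.
    return copy.deepcopy(state)

def merge(base, a, b):
    """Three-way merge of two forks that diverged from `base`.
    Unlike fork(), this needs a resolution policy for every conflict."""
    merged = {}
    for key in sorted(base.keys() | a.keys() | b.keys()):
        va, vb, vbase = a.get(key), b.get(key), base.get(key)
        if va == vb:
            merged[key] = va       # both forks agree
        elif va == vbase:
            merged[key] = vb       # only b diverged; take b's version
        elif vb == vbase:
            merged[key] = va       # only a diverged; take a's version
        else:
            # Both diverged differently: no mechanical answer exists.
            raise ValueError(f"conflict on {key!r}: needs domain knowledge")
    return merged

base = {"memory": "childhood", "skill": "piano"}
a = fork(base); a["skill"] = "violin"
b = fork(base); b["memory"] = "childhood plus one trip"
print(merge(base, a, b))
```

The point of the sketch is that `fork` works for any state whatsoever, while `merge` already fails (raises) the moment both copies touch the same attribute, which for anything as densely interconnected as a brain would be immediately.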
“would require extremely detailed neurological understanding of memory storage and retrieval.” Sorry, is this on a blog where superintelligences have been known to simulate googolplexes of perfectly accurate universes in order to optimize the number of non-blue paperclips therein?
The original post stipulated that I was “forced” to terminate all the copies but one; that was the nature of the hypothetical I was choosing to examine. A hypothetical where the copies aren’t deleted would be a totally different situation.
I was just having a laugh at the follow-up justification where technical difficulties were cited, not criticizing the argument of your hypothetical; sorry if it came off that way.
As you’d probably assume they would based on my OP, my copies, if I’d been heartless enough to make them and able to control them, would scream in existential horror as each came to know that that-which-he-is was to be ended. My copies would envy the serenity of your copies, but think them deluded.
So, I don’t think I felt the way I do now prior to reading the Quantum Thief novels, in which characters are copied and modified with reckless abandon and don’t seem to get too bent out of shape about it. It has a remarkable effect on your psyche to observe other people (even if those people are fictional characters) dealing with a situation without having existential meltdowns. Those novels allowed me to think through my own policy on copying and modification, as an entertaining diversion.
This just popped into my head over lunch: Omega appears and says he will arrange for your taxes to be done and for a delightful selection of funny kitten videos to be sent to you, but only if you allow 100 perfect simulations of yourself to be created and then destroyed. Do you accept?
Sounds more sinister my way.
Forking would mean thinning of resources and a lot of unnecessary repetition. With fusing, you could compute the common part only once and the divergent parts once per instance. Early technologies are probably going to be very resource-intensive, so it’s not like there is abundance to spare even if it were straightforward to do.
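The "common part once, divergent parts per instance" idea is essentially copy-on-write. A minimal sketch (the `CowFork` class and its attribute names are invented for illustration): all forks share one stored copy of the common state, and each fork pays storage only for where it has diverged.

```python
from types import MappingProxyType

class CowFork:
    """Copy-on-write fork: the common state is stored once and shared
    read-only; each instance keeps only its own divergences."""
    def __init__(self, shared):
        self.shared = shared   # one immutable copy, shared by all forks
        self.delta = {}        # per-instance divergences only

    def read(self, key):
        # A fork's own divergence shadows the shared value.
        return self.delta.get(key, self.shared.get(key))

    def write(self, key, value):
        self.delta[key] = value  # divergence stays local to this fork

common = MappingProxyType({"memories": "40 years", "language": "English"})
f1, f2 = CowFork(common), CowFork(common)
f1.write("language", "Finnish")
# Both forks reference the same `common` object; only f1 stores a delta.
```

Under this scheme the marginal cost of a fork is proportional to how far it has diverged, not to the full size of the mind, which is exactly why fusing divergent forks back together would be the expensive direction.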
I guess this all depends on what kind of magical assumptions we’re making about the tech that would permit this.