What’s the point of uploading if we have an AI with all the skills and knowledge of everyone not information-theoretically dead at the time of its creation?
I have no idea how to argue with the ideas about consciousness/identity/experience/whatever that make uploading seem like it could qualify as avoiding death. It occurs to me, though, that those same ideas arguably make uploading individuals pointless. If strong AI doesn’t happen, why not just upload the most useful bits of people’s brainstates and work out how to combine them into some collective that is not any one person, but has everyone’s knowledge and skills? Why install, say, Yudkowsky.wbe and Immortalbob.wbe, when you could install them as patches to grandfather_of_all_knowledge.mbe, effectively getting (Yudkowsky|Immortalbob)?
Assume that any properties of a brain that can be quantified get quantified, and that where the variables match up, the | (or) operation takes the maximum. So if brain A is better at math than brain B, brain A|B uses A’s math ability. Where the variables don’t reduce to a single scale, though, the merge keeps both: if brain M is male and brain F is female, brain M|F will learn from both perspectives (and hopefully be stronger than the originals for it).
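To make the merge rule concrete, here’s a minimal sketch, assuming brain states really could be reduced to named, comparable numeric traits. Everything here (`merge_brains`, the trait names, the 0-to-1 scale) is hypothetical and purely illustrative, not a claim about how an actual emulation would work:

```python
# Hypothetical sketch of the "|" merge: brain states reduced to named,
# comparable traits, combined by taking the maximum wherever the
# variables line up. All names and scales are illustrative.

def merge_brains(*brains: dict[str, float]) -> dict[str, float]:
    """For each quantified trait, the merged brain keeps the best value."""
    merged: dict[str, float] = {}
    for brain in brains:
        for trait, value in brain.items():
            merged[trait] = max(merged.get(trait, value), value)
    return merged

# Brain A is better at math, brain B at languages; A|B gets the best of both.
brain_a = {"math": 0.9, "languages": 0.4}
brain_b = {"math": 0.5, "languages": 0.8}
print(merge_brains(brain_a, brain_b))  # {'math': 0.9, 'languages': 0.8}
```

Traits that don’t reduce to one scale (the M/F example above) would need a different combining rule than `max`, which is exactly where the hand-waving lives.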
So the benefits of a group, with none of the drawbacks, all crammed into one metabrain.
Because I want to be alive. I don’t just want humanity to have the benefit of my skills and knowledge.
When I read this in the recent comments list, I at first thought it was a position against uploading. Then I read the other recent comments and realized it was probably a reply to me.
I get the impression that no one has a functional definition of what continuity of identity means, yet destructive copies (uploads, teleports, etc.) appear to be overwhelmingly considered to preserve it at least as well as sleep does. I find this confusing, and the only argument I’ve found that seems to support it is Eliezer’s “Identity Isn’t In Specific Atoms”, which is a bit disingenuous here, in that uploads are almost certainly not going to be precise quantum-state replicators.
(I’d make a poll here, but my last attempt went poorly, and since it doesn’t appear to be standard markup, I don’t know where I’d test it.)
What probability would you assign to each of these as continuing personal identity?
1. Sleep.
2. Puberty.
3. The typical human experience over 1-5 years.
4. Gradual replacement of biological brain matter with artificial substitutes.
5. Brain-state copying (uploads, teleportation).
6. Brain-state melding (Brain Omega = Brain A | Brain B | … | Brain N).
1) 1.0
2) 1.0
3) 1.0
4) It depends on the artificial substitutes :) If they faithfully replicate brain function (whatever that means), 1.0
5) Again, if the process is faithful, 1.0
6) It really depends. For example, if you drop all my memories, 0.0. If you keep an electronic copy of my brain on the same network as several other brains, 1.0. In-between cases get in-between values.
(Yes, I know 1.0 probabilities are silly. I don’t have enough sig-figs of accuracy for the true value :)
I don’t think most people who believe uploading qualifies as avoiding death would agree that adding a fraction of a person’s brainstate to an overmind qualifies as well.