I... do not think there is any scenario in which allowing unrestricted copying of minds does not end in the apocalypse. It is an open invitation for a paperclipped solar system, only instead of paperclips, all available mass has been turned into instances of Sam Clado from Alaska who just really likes himself. It sets off my “BAD IDEA” detectors really, really hard. And then, when I stop and consider it on a less reflexive level? It seems worse.
Backups (that is, copies in a frozen, non-running state) could be permitted, but giving anyone permission to just manufacture more selves? No. Hell No. If the government wants to outlaw that on pain of pain, I am in favor.
giving anyone permission to just manufacture more selves? No. Hell No.
This seems like a 1000× faster version of one ethnic group reproducing faster than another ethnic group in their neighborhood. And the slow version already makes people kill each other.
Analogously, what about the welfare state? Are we going to guarantee at least some minimum human rights to ems? Because if we do, and if someone is happy to live at the minimum level, what’s going to stop them from making as many copies as possible and letting the government pay or collapse? (Or perhaps illegal copies don’t have the same rights? Now we have slavery.)
I don’t disagree with your analysis. What I’m pointing out is that I haven’t seen any workable proposals to prevent this (or another very disagreeable scenario), except for 1) a singleton AI controlling the effective laws of physics in its light cone, or 2) somehow making sure nobody but a single player (“ruler”) has the ability to create computer hardware and/or software. In which case, the universe will probably be tiled with that ruler. The incentives to create copies and the resulting evolutionary pressures are too great.
And in the context of all this, ideas like democracy are completely unworkable unless one of these restrictions is implemented.
How does all available mass get turned into Sam Clado, unless there is some physical replicator? And if there’s a physical replicator, is it really all that important whether it’s replicating to create hardware for Sam Clado, or replicating just to replicate?
Sam’s the one who ordered it to replicate without bound. Others may have different ideas of how much ought to be mined, so it’s not a given that that is how things will end up.
Any world where I cannot make and interact with copies of myself and/or custom minds forever, is a hell as far as I’m concerned. That it is currently so slow and dangerous is bad enough.
I’m all for (1). I’ve yet to see a plausible scenario not involving a singleton AI that isn’t, on some level, horrifying.
I agree with your estimation. But there may be equilibria that don’t end so badly, and that are more implementable than total restriction...
Preventing the copying of minds strikes me as a bad idea. You can only make seven billion people so happy.
Any idea on how to figure out how many copies to allow?