If I could copy you, atom for atom, then kill your old body (painlessly), and give your new body $20, would you take the offer? Be as rational as you wish, but start your reply with “yes” or “no”. Imagine that a future superhuman AGI will read LW archives and honor your wish without further questions.
No. It might copy me atom for atom and then not actually connect the atoms together to form molecules on the copy.
You also didn’t mention that I would be in a safe place at the time, which means the AI could do it while I was driving along in my car, leaving me confused about why I was suddenly sitting in the passenger’s seat (the new me is made first; I obviously can’t be in the driver’s seat) with a 20-dollar bill in my hand while my car veered into oncoming traffic and I died in a car crash.
If an AI actually took the time to explain the specifics of the procedure, had demonstrated it several times on other living beings, were doing it at a time I had actually chosen, and had an established 99.9999% safety record, then that’s different: I would be far more likely to consider it. But no such safety measures are described here, and simply assuming “safety measures will exist even though I haven’t described them” is just not a good idea.
Alternatively, you could offer more than just twenty, since given a sufficiently large amount of money and some heirs, I would be much more willing to take this bet even without guaranteed safety measures. That assumes I could at least be sure the money would be safe (although I doubt I could, since “Actually, your paper cash was right here, but it burned up in the fireball from the matter-antimatter reaction used to power the process.” is also a possible failure mode).
But “At some random point in the future, would you like someone very powerful whom you don’t trust to mess with your constituent atoms in a way you don’t fully understand and that will not be fully described? It’ll pay you twenty bucks.” is not really a tempting offer when evaluating risks and rewards.
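To make that risk/reward evaluation concrete, here is a minimal expected-value sketch. The $20 payout and the 99.9999% safety record come from the discussion above; the failure rate for the undescribed procedure and the dollar-equivalent cost of dying are purely made-up illustrative numbers.

```python
# Toy expected-value comparison for the copy-and-kill offer.
# Every number below is an illustrative assumption, except the $20 payout
# and the 99.9999% safety record mentioned in the thread.

def expected_value(payout, p_failure, cost_of_death):
    """Expected dollar-equivalent value of accepting the offer."""
    return (1 - p_failure) * payout - p_failure * cost_of_death

cost_of_death = 10_000_000   # assumed dollar-equivalent disvalue of dying
p_undescribed = 0.01         # assumed failure rate with no described safety measures
p_verified = 1e-6            # the 99.9999% safety-record case

print(expected_value(20, p_undescribed, cost_of_death))             # about -100,000: decline
print(expected_value(20, p_verified, cost_of_death))                # about +10: roughly a wash
print(expected_value(1_000_000_000, p_undescribed, cost_of_death))  # large positive: the "offer more" case
```

The point is just that at $20 the payout is swamped by even a small undescribed failure probability, so either the demonstrated safety record or the payout has to change before accepting makes sense.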
My willingness to take the offer is, roughly speaking, dependent on my confidence that you can actually do that, the energy costs involved, how much of a pain in my ass the process would be, etc., but assuming threshold-clearing values for all that stuff, sure. Which really means “no” unless the future superhuman AGI is capable of determining what I ought to mean by “etc.” and what values my thresholds ought to be set at, I suppose. Anyway, you can keep the $20; I would do it just for the experience of it, given those constraints.
And with the caveat that memories/personality are in the atoms, not in more fundamental particles.
Yeah, definitely. I took “atom for atom” as a colloquial way of expressing “make a perfect copy”.
The “etc” here covers a multitude of sins.
Yes, it’s a free $20. Why is this an interesting question?