Eliezer...the main issue that keeps me from cryonics is not whether the “real me” wakes up on the other side. Most smart people would agree that this is a non-issue, a silly question arising from the illusion of mind-body duality.
The first question is about how accurate the reconstruction will be. When you wipe a hard drive with a magnet, you can recover some of the content, but usually not all of it. Recovering “some” of a human, but not all of it, could easily create a mentally handicapped, broken consciousness.
Setting that aside, there is a second problem. If and when immortality and AI are achieved, what value would my revived consciousness contribute to such a society?
Your position thus far has been that death isn’t a bad thing when a copy of the information is preserved and later revived. You’ve explained that you are willing to treat consciousness much like you would a computer file—that you would be willing to destroy one of two redundant duplicates of yourself.
Tell me, why exactly is it okay to destroy a redundant duplicate of yourself? You can’t say that it’s okay to destroy it simply because it is redundant, because that also destroys the point of cryonics. There will be countless humans and AIs that will come into existence, and each of those minds will require resources to maintain. Why is it so important that your, or my, consciousness be one among this swarm? Isn’t that...well...redundant?
For the same reasons that you would be willing to destroy one of two identical copies, you should be willing to destroy all the copies, given that the software—the consciousness—running on them is not exceptional among all the possible consciousnesses to which those resources could be devoted.