If someone cloned my body atom for atom, “I” feel like it wouldn’t really be me, just an illusion fooling outside observers. Same for mind uploads.
Do any other atheists feel the same way?
Yes, many do. A part of me does. However, I'm pretty sure that part of me is wrong (i.e. falling for an intuitive trap), because it doesn't square with my other, more powerful intuitions about identity.
For example, consider the way I anticipate my decisions today affecting my actions tomorrow. This feels identity-critical, yet the effect would be no different on a materially continuous future self than on a cloned or simulated future self.
As to cryonics, that's obviously not quite the same as a mind upload, but it feels like a greyish area, if the original cells are destroyed.
The cells might be repaired instead of being destroyed and replaced. It depends on what is ultimately feasible / comes soonest in the tech tree. Many cryonicists have expressed a preference for repair, with some saying that uploading would be no better than death for them.
Also if we reach the point of perfect brain preservation in your lifetime it could be implanted into a cloned body (perhaps a patchwork of printed organs) without requiring repairs. This would be the least death-like version of cryonics short of actually keeping the entire body from experiencing damage.
Note that some cell loss and replacement is going on already in the ordinary course of biology. Presumably one of the future enhancements available would be to make your brain more solid-state so that you wouldn’t be “dying and getting replaced” every few months.
Another thing: even if my world is just a simulation (even the NYT has written about this theory), which I have no way of knowing, cloning myself and killing the original would still be suicide, with very negative utility.
I’m not sure I follow. If the world is a simulation, there are probably all kinds of copy-paste relationships between your past and future self-moments, this would just be one more to add to the pile.
However, it is a good point that if you believe your identity is conserved in the original, and you want to survive and don't value the clone's life above your own, you should precommit not to kill the original should you ever wake up as the clone (and instead, if it comes down to an either/or, to kill yourself as the clone).
But at the same time as you are anticipating this decision, you would be rejecting the notion that the clone is going to be really you, and the clone would also reject that it is really you.