I’m an atheist, and believe that my mind can be seen as simply “software” running on my brain. However that “software” also believes that “I” is not just the software, but the brain and perhaps even the rest of the body.
If someone cloned my body atom for atom, “I” feel like it wouldn’t really be me, just an illusion fooling outside observers. Same for mind uploads.
Do any other atheists feel the same way?
As to cryonics, that’s obviously not quite the same as a mind upload, but it feels like a greyish area, if the original cells are destroyed.
Another thing: if my world is just a simulation (even the NYT wrote about this theory), which I have no way of knowing, then cloning myself and killing the original is still suicide, with a very negative utility.
What do others think? I know that Kurzweil can’t wait to upload his mind, and Goertzel wants multiple copies of himself to hedge his bets.
I reserve some uncertainty that I’m fundamentally wrong about how physics and anthropics work, so I’d treat the question the same as “how much money would I have to pay you to accept a ~10% chance of instant painless death”?
In other words, $20 definitely ain’t going to cut it, but a hundred million would. (I have a hard time estimating exactly where the cutoff point would be.)
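For what it’s worth, the simplest way to locate that cutoff is a linear expected-value calculation. This is a crude sketch only: it assumes utility is linear in money (it isn’t, for sums this large) and plugs in a purely illustrative dollar value on one’s life.

```python
# Break-even payment P for accepting probability p of instant painless death,
# assuming utility linear in money and a fixed dollar value V on the life lost.
# Both assumptions are illustrative simplifications, not endorsements.
#
# Indifference condition: (1 - p) * P == p * V, so P = p * V / (1 - p).

def break_even_payment(p_death: float, value_of_life: float) -> float:
    """Payment at which the gamble has zero expected value."""
    return p_death * value_of_life / (1.0 - p_death)

# With p = 10% and a (hypothetical) $100M value placed on one's life:
print(break_even_payment(0.10, 100e6))  # ~11.1 million dollars
```

Under these toy assumptions the break-even point sits around $11M, comfortably between the $20 and $100M brackets above; any risk aversion would push the real cutoff higher.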
“Look at any photograph or work of art. If you could duplicate exactly the first tiny dot of color, and then the next and the next, you would end with a perfect copy of the whole, indistinguishable from the original in every way, including the so-called ‘moral value’ of the art itself. Nothing can transcend its smallest elements”—CEO Nwabudike Morgan, “The Ethics of Greed”, Sid Meier’s Alpha Centauri
He never did, but I plan to post a whole slew of Alpha Centauri quotes in next month’s thread. There are so many good ones. I just started playing it again.
I briefly thought that way, thought about it more, and realized that that view of identity was incoherent. There are lots of thought experiments you can pose in which that picture of identity produces ridiculous, unphysical, or contradictory results. I decided it was simpler to conclude that it was the information, and not the meat, that dictated who I was. So long as the computational function being enacted is equivalent, it’s me. Period. No ifs, ands, or buts.
If someone cloned my body atom for atom, “I” feel like it wouldn’t really be me, just an illusion fooling outside observers. Same for mind uploads.
Do any other atheists feel the same way?
Yes, many do. A part of me does. However, I’m pretty sure that part of me is wrong (i.e. falling for an intuitive trap), because it doesn’t square with my other, more powerful intuitions about identity.
For example, there is the manner in which I anticipate my decisions today impacting my actions tomorrow. This feels identity-critical, yet the effect they have would not be any different on a materially continuous future self than on a cloned or simulated future self.
As to cryonics, that’s obviously not quite the same as a mind upload, but it feels like a greyish area, if the original cells are destroyed.
The cells might be repaired instead of being destroyed and replaced. It depends on what is ultimately feasible / comes soonest in the tech tree. Many cryonicists have expressed a preference for this, some saying that, for them, uploading would be no better than death.
Also, if we reach the point of perfect brain preservation in your lifetime, the brain could be implanted into a cloned body (perhaps a patchwork of printed organs) without requiring repairs. This would be the least death-like version of cryonics short of actually keeping the entire body from experiencing damage.
Note that some cell loss and replacement is going on already in the ordinary course of biology. Presumably one of the future enhancements available would be to make your brain more solid-state so that you wouldn’t be “dying and getting replaced” every few months.
Another thing: if my world is just a simulation (even the NYT wrote about this theory), which I have no way of knowing, then cloning myself and killing the original is still suicide, with a very negative utility.
I’m not sure I follow. If the world is a simulation, there are probably all kinds of copy-paste relationships between your past and future self-moments; this would just be one more to add to the pile.
However it is a good point that if you believe your identity is conserved in the original, and you want to survive and don’t value the clone’s life above your own, you should precommit not to kill the original if you should ever happen to wake up as the clone (you should kill yourself as the clone instead if it comes up as an either/or option).
But even as you anticipate this decision, you would be rejecting the notion that the clone is going to be really you, and the clone would likewise reject that it is really you.
I’m an atheist, and believe that my mind can be seen as simply “software” running on my brain. However that “software” also believes that “I” is not just the software, but the brain and perhaps even the rest of the body.
And also, among other things, software outside your brain (including in other brains). My brain might be the main part of my identity, but not its only part: other parts of it are in the rest of my body, in other people’s brains, in my wardrobe, in my laptop’s hard disk, in Facebook’s server, in my university’s database, in my wallet, etc. etc. etc.
I’m an atheist, and believe that my mind can be seen as simply “software” running on my brain. However that “software” also believes that “I” is not just the software, but the brain and perhaps even the rest of the body.
If someone cloned my body atom for atom, “I” feel like it wouldn’t really be me, just an illusion fooling outside observers. Same for mind uploads.
Do any other atheists feel the same way?
As to cryonics, that’s obviously not quite the same as a mind upload, but it feels like a greyish area, if the original cells are destroyed.
Another thing: if my world is just a simulation (even the NYT wrote about this theory), which I have no way of knowing, then cloning myself and killing the original is still suicide, with a very negative utility.
What do others think? I know that Kurzweil can’t wait to upload his mind, and Goertzel wants multiple copies of himself to hedge his bets.
It would be you as much as you are the you of a second ago.
I had a hidden ugh-field about that one. It took quite a few repetitions of the Litany of Gendlin to grok it.
So, if I could copy you, then kill your old body (painlessly) and give the new body $20, would you take the offer? yes/no?
I reserve some uncertainty that I’m fundamentally wrong about how physics and anthropics work, so I’d treat the question the same as “how much money would I have to pay you to accept a ~10% chance of instant painless death”?
In other words, $20 definitely ain’t going to cut it, but a hundred million would. (I have a hard time estimating exactly where the cutoff point would be.)
I’m confident, but not that confident.
Depends. Is the new body going to be here, or ‘up’ in the ship? And is there a beam of some kind involved?
“Look at any photograph or work of art. If you could duplicate exactly the first tiny dot of color, and then the next and the next, you would end with a perfect copy of the whole, indistinguishable from the original in every way, including the so-called ‘moral value’ of the art itself. Nothing can transcend its smallest elements”—CEO Nwabudike Morgan, “The Ethics of Greed”, Sid Meier’s Alpha Centauri
If that hasn’t yet been brought up as a quote in a rational quote thread it really should be.
And also, among other things, software outside your brain (including in other brains). My brain might be the main part of my identity, but not its only part: other parts of it are in the rest of my body, in other people’s brains, in my wardrobe, in my laptop’s hard disk, in Facebook’s server, in my university’s database, in my wallet, etc. etc. etc.
Many of those things cryonics couldn’t preserve.