The thing is, I’m just not sure it’s even reasonable to talk about ‘immortality’, because I don’t know what it means for one personal identity (‘soul’) to persist. I couldn’t be sure that if a computer simulated my mind, it would be ‘me’, for example. Immortality will likely involve serious changes to the physical form our minds take, and once you start talking about that you get into the realm of thought experiments like this one: if you put someone under a general anaesthetic, take one atom out of their brain, then wake them up, you have a similar person but not the one who originally went under the anaesthetic. From the perspective of the original person, undergoing the operation was pointless, because they are dead anyway. The person who wakes from the operation is someone else entirely.
I guess I’m just trying to say that immortality makes heaps of sense if we can somehow solve the question of personal identity. But if we can’t say what it takes for a single ‘soul’ to persist over time, the very concept of ‘immortality’ may be ill-defined, and pretty nonsensical to talk about.
I liked your post about the heat death of the universe. If you ever figure anything out regarding the persistence of personal identity, I’d like you to message me or something.
Isn’t it purely a matter of definition? You can say that a version of you differing by one atom is you, or that it isn’t; or that a simulation of you either is or isn’t you; but there’s no objective right answer. It is worth noting, though, that if you don’t tell the different-by-one-atom version, or the simulated version, about the difference, they would probably never question being you.
If there’s no objective right answer, then what does it mean to seek immortality? For example, if we found out that a simulation of ‘you’ is not actually ‘you’, would seeking immortality mean we can’t upload our minds to machines and have to somehow figure out a way to keep the pink fleshy stuff that is our current brains around?
If we found out that there’s a new ‘you’ every time you go to sleep and wake up, wouldn’t it make sense to abandon the quest for immortality as we already die every night?
(Note: I don’t actually think this happens. But I think the concept of personal identity is inextricably linked to the question of how separate consciousnesses, each feeling its own qualia, can arise.)
If there’s no objective right answer, you can just decide for yourself. If you want immortality and decide that a simulation of ‘you’ is not actually ‘you’, I guess you (‘you’?) will indeed need to find a way to extend your biological life. If you’re happy with just the simulation existing, then maybe brain uploading or FAI is the way to go. But we’re not going to “find out” the right answer to those questions if there is no right answer.
Are you talking about the hard problem of consciousness? I’m mostly with Daniel Dennett here and think that the hard problem probably doesn’t actually exist (but I wouldn’t say that I’m absolutely certain about this), but if you think that the hard problem needs to be solved, then I guess this identity business also becomes somewhat more problematic.
I think consciousness arises from physical processes (as Dennett says), but that’s not really solving the problem or proving it doesn’t exist.
Anyway, I think you’re right that if you believe being mind-uploaded does or doesn’t constitute continuing your personal identity, it’s hard to say you’re wrong. However, what if I don’t actually know whether it does, yet I want to be immortal? Then we have to study the question, to figure out which things we can do keep the real ‘us’ existing and which don’t.
What if the persistence of personal identity is a meaningless thing to pursue?
Let’s suppose that the contents of a brain are uploaded to a computer, or that a person is anaesthetised and a single atom in their brain is replaced. What exactly would it mean to say that personal identity doesn’t persist in such situations?
So, let’s say you die, but a superintelligence reconstructs your brain (using new atoms, almost exactly to specification) but misplaces a couple of atoms. Is that ‘you’?
If it is, let’s say the computer then realises what it did wrong and reconstructs your brain again (leaving its first prototype intact), this time exactly. Which one is ‘you’?
Let’s say the second one is ‘you’, and the first one isn’t. What happens when the computer reconstructs yet another exact copy of your brain?
If the computer told you it was going to torture the slightly-wrong copy of you (the one with a couple of atoms misplaced), would that scare you?
What if it was going to torture an exact copy of you, but only one of the exact copies? There’s a version of you not being tortured; what’s to say that won’t be the real ‘you’?
Maybe; it would probably think so, at least if it wasn’t told otherwise.
Both would probably think so.
All three might think so.
I find that a bit scary.
Wouldn’t there, then, be some copies of me not being tortured and one that is being tortured?
If I copied your brain right now, but left you alive, and tortured the copy, you would not feel any pain (I assume). I could even torture it secretly and you would be none the wiser.
So go back to the scenario: you’re killed, and some exact copies and some inexact copies of your brain are made. It has been shown that an exact copy of your brain can be tortured without ‘you’ being tortured, so surely one or all of these reconstructed brains could be tortured and you would have no reason to fear?
Well... Let’s say I make a copy of you at time t, and make both of you forget which one is which. Then, at time t + 1, I will tickle the copy a lot. After that, I go back in time to t − 1, tell you of my intentions, and ask whether you expect to get tickled. What do you reply?
Does it make any sense to you to say that you expect to experience both being and not being tickled?