“Do I choose between being forced to exist forever, or to die after less than 100 years of existence? Neither. I’d like to have the option to keep living for as long as I want.”
I didn’t mean being forced to exist forever, or pre-committing to anything. I meant that I really do WANT to exist forever, yet I can’t see a way that it can work. That’s the dilemma I mentioned: to die, ever, even after a gazillion years, feels horrible, because YOU will cease to exist, no matter how much time has passed. To never die feels just as horrible, because I can’t see a way to remain sane after a very long time.
Who can guarantee that after x years you would feel satisfied and ready to die? I believe that as long as the brain remains healthy, it really doesn’t want to die. And even if it could reach the state of accepting death, the current me just doesn’t ever want to cease to exist. Doesn’t the idea of inevitably eventually ceasing to exist feel absolutely horrible to you?
“Doesn’t the idea of inevitably eventually ceasing to exist feel absolutely horrible to you?”
No. There is nothing I find inherently scary or unpleasant about nonexistence.
I’m just confused about the details of why that would happen. I mean, it would be sad if some future utopia didn’t have a better solution for insanity, or for having too many memories, than nonexistence.
Insanity: Look at the algorithm of my mind and see how it’s malfunctioning? If nothing else works, revert my mindstate back a few months/years?
Memories: offload into long-term storage?
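Under the computationalist assumption, both fixes are at least easy to picture. Here is a toy Python sketch of the checkpoint-and-archive idea; everything in it (the Mind class, snapshot/rollback, offload) is hypothetical illustration, not a claim about how real minds work:

```python
import copy

class Mind:
    """Toy mindstate, assuming (contestably) that a mind is an
    algorithm whose state can be copied, restored, and edited."""

    def __init__(self):
        self.memories = []    # "hot" memories: the actively lived-in self
        self.archive = []     # long-term storage, outside the active self
        self.snapshots = []   # periodic checkpoints of the whole mindstate

    def experience(self, event):
        self.memories.append(event)

    def snapshot(self):
        # Checkpoint the current mindstate so a later, malfunctioning
        # version can be reverted to it.
        self.snapshots.append(copy.deepcopy(self.memories))

    def rollback(self, n=1):
        # "Revert my mindstate back a few months/years": restore the
        # n-th most recent checkpoint, discarding everything since.
        self.memories = copy.deepcopy(self.snapshots[-n])

    def offload(self, keep_recent=1000):
        # "Offload into long-term storage": move all but the most recent
        # memories into cold storage. Whether the slimmed-down mind is
        # still the same person is exactly the open question below.
        self.archive.extend(self.memories[:-keep_recent])
        del self.memories[:-keep_recent]
```

Note that rollback is itself a small loss of self: everything experienced since the checkpoint is gone, which is part of why neither fix obviously dissolves the dilemma.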
“No. There is nothing I find inherently scary or unpleasant about nonexistence.”
Would you agree that you’re perhaps in the minority? That most people are scared of, or depressed about, their own mortality?
“I’m just confused about the details of why that would happen. I mean, it would be sad if some future utopia didn’t have a better solution for insanity, or for having too many memories, than nonexistence.
Insanity: Look at the algorithm of my mind and see how it’s malfunctioning? If nothing else works, revert my mindstate back a few months/years?
Memories: offload into long-term storage?”
On insanity: computationalism might be false, and consciousness might not be algorithmic. If it is algorithmic, you’re right that it’s probably easy to deal with.
But I suspect that excess memories might always remain a problem. Is it really possible to offload them while maintaining personal identity? That’s an open question in my view, especially if, like me, you don’t really buy into computationalism.