“No. There is nothing I find inherently scary or unpleasant about nonexistence.”
Would you agree that you’re perhaps in the minority? That most people are scared of, or depressed about, their own mortality?
“I’m just confused about the details of why that would happen. I mean, it would be sad if some future utopia didn’t have a better solution for insanity or for having too many memories, than nonexistence.
Insanity: Look at the algorithm of my mind and see how it’s malfunctioning? If nothing else works, revert my mindstate back a few months/years?
Memories: offload into long-term storage?”
On insanity, computationalism might be false. Consciousness might not be algorithmic. If it is, you’re right, it’s probably easy to deal with.
But I suspect that excess memories might always remain a problem. Is it really possible to offload them while maintaining personal identity? That’s an open question in my view.
Especially when, like me, you don’t really buy into computationalism.