I don’t think it is any more horrifying than being stuck in one reality, treasuring memories. It is certainly less horrifying than our current human existence, with its prospects of death, suffering, boredom, heartache, etc.
Your fear just seems to be about something different from what you’re used to.
But you’re always stuck in one reality.
Let’s take a step back, and ask ourselves what’s really going on here. It’s an interesting idea, for which I thank you; I might use it in a story. But...
By living your life in this way, you’d be divorcing yourself from reality. There is a real world, and if you’re interacting solely with these artificial worlds, you’re not interacting with it. That’s what sets off my “no way, no how” alert, in part because it seems remarkably dangerous: anything might happen, your computing infrastructure might be stolen out from under you, and you wouldn’t necessarily know.
Disclaimer: This comment may sound very crackpottish. I promise the ideas in it aren’t as wonky as they seem, but it would be too hard to explain them properly in such a short space.
By living your life in this way, you’d be divorcing yourself from reality.
Here comes the notion that in posthumanism there is no definite reality. Reality is a product of experiences and of how your choices influence those experiences; in posthumanism, however, you can modify those freely. What we call reality is a very local phenomenon.
Anyhow, it’s not the case that your computing infrastructure would be in danger: it would either be protected by some powerful AI, much better suited to protecting your infrastructure than you are, or there would be other copies of you keeping up the maintenance in “meatspace.” (Again, I strongly believe that it’s only our contemporary perspective that makes us feel that the reality in which the computations are performed is more real than virtual reality.)
What’s more, a Waker can be perfectly aware that there is a world beyond what she is experiencing, and may occasionally leave her reality.
I don’t see how making our past less memorable is desirable: you might choose to fade certain memories, but in general there’s no obvious benefit to making all memories weaker. It seems that you would be destroying things (memories) that you (apparently) valued, and doing it for no particular reason.
I can see that if you got really, really bored you might like to cycle through variations on your favorite realities without losing novelty, but in that case it seems like you would want to try almost everything else first… you are basically giving up on personal progress in favor of hedonism.
You might also question, once you’ve reached the point of being a preferential waker (that is, you aren’t doing it as some sort of therapy, but because you honestly prefer it), whether personal identity across ‘wakes’ is a real thing anymore.