“There are negative possibilities (woken up in dystopia and not allowed to die) but they are exotic, not having equal probability weight to counterbalance the positive possibilities.”
That doesn’t seem at all obvious to me. First, our current society doesn’t allow people to die, although today’s law enforcement is spotty enough that it can’t really prevent it. I assume far-future societies will have excellent law enforcement, including mind reading and total surveillance (unless libertarians seriously get their act together in the next hundred years). I don’t see any reason why the taboo on suicide must disappear. And any society advanced enough to revive me has by definition conquered death, so I can’t just wait it out and die of old age. I place about 50% odds on not being able to die again after I get out.
I’m also less confident that the future wouldn’t be a dystopia. Even in the best-case scenario the future is going to be scary through sheer cultural drift (see: legalized rape in Three Worlds Collide). I don’t have to tell you that it’s easier to get a Singularity that goes horribly wrong than one that goes just right, and even if we restrict the possibilities to those where I get revived instead of turned into paperclips, they could still be pretty grim. (What about some well-intentioned person hard-coding “Promote and protect human life” into an otherwise poorly designed AI, and ending up with something that resurrects the cryopreserved... and then locks them in little boxes for all eternity so they don’t consume unnecessary resources?) And then there’s the standard fear of a dictator or fundamentalist theocracy, only this time armed with mind control and total surveillance, so there’s no chance of overthrowing them.
The deal-breaker is that I really, really don’t want to live forever. I might enjoy living a thousand years, but not forever. You could change my mind if you had a utopian post-Singularity society that had completely mastered Fun Theory. But when I compare the horrible possibility of being forced to live forever, either in a dystopia or in a world much like our own, to the good possibility of getting to live somewhere between a thousand years and forever in a Fun Theory utopia that can keep me occupied... well, the former seems both more probable and more extreme.
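To make that comparison concrete, here is a minimal expected-value sketch in Python. The only figure taken from the comment above is the 50% chance of being unable to die again; every other probability and utility is a placeholder chosen purely for illustration, not an estimate anyone in this thread has endorsed:

```python
# Toy expected-value sketch of the revival gamble described above.
# The only number taken from the comment is the ~50% chance of being
# unable to die again; everything else is an illustrative placeholder.

p_cant_die = 0.50  # from the comment: odds of being unable to opt out

# (probability of this world given revival, utility if you CAN still die,
#  utility if you CANNOT) -- all placeholder values.
scenarios = {
    "Fun Theory utopia":    (0.20,  1000,   1000),
    "world much like ours": (0.50,   100,   -500),
    "dystopia":             (0.30,  -200, -10000),
}

expected_value = sum(
    p * ((1 - p_cant_die) * u_free + p_cant_die * u_trapped)
    for p, u_free, u_trapped in scenarios.values()
)
print(f"Expected value of revival under these assumptions: {expected_value:+.1f}")
```

With placeholders in this ballpark the sum comes out negative, which is the shape of the worry: the bad branches do not need to be likely if their downside is heavy enough.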
The threat of dystopia underscores the importance of finding or creating a trustworthy, durable institution that will relocate or destroy your body if the political system starts turning grim.
Of course there is no such thing. Boards can become infiltrated. Missions can drift. Hostile (or even well-intentioned) outside agents can act suddenly before your guardian institution can respond.
But there may be measures you can take to reduce that risk to acceptable levels (i.e., levels comparable to your current risk of exposure to, as Yudkowsky mentioned, a secret singularity-in-a-basement):
You could make contracts with (multiple) members of the younger generation of cryonicists, on condition that they contract with their own younger generation, and so on, to guard your body throughout the ages (a rough sketch of how much this redundancy buys appears below).
You can hide a very small bomb in your body that continues to count down slowly even while you are frozen (I don’t know if we have the technology yet, but it doesn’t sound too sophisticated), so as to cap how far from the present day you are willing to be carried before revival (an explosion small enough to destroy your brain, but not the brain next to you).
You can have your body hidden, with its location known only to cryonicist leaders.
You can have your body’s destruction forged.
No matter what arrangements you make, if you choose to freeze yourself you can never get the probability of being indefinitely tortured upon reanimation down to zero. So what is an acceptable level of risk? I’ll offer a lower bound: the probability that a terrorist group has already secretly figured out how to extend life indefinitely, and is en route to kidnap you now.

I don’t think any combination of these suggestions will suffice on its own. But it is worth a great deal of effort to invent more (and not necessarily share them all online), and to make them feasible, if you are considering freezing yourself.
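As a rough illustration of the redundant-guardian idea in the list above, here is a minimal model of how independent chains of guardian contracts compound. The number of chains, the number of generational hand-offs, and the per-hand-off failure rate are all placeholder assumptions for illustration:

```python
# Toy model: probability that at least one chain of guardian contracts
# is still intact after n generational hand-offs, given k independent
# chains. All numbers below are illustrative placeholders.

def p_some_chain_survives(k_chains: int, n_generations: int, p_fail: float) -> float:
    """P(at least one chain survives n hand-offs), chains assumed independent."""
    p_single_chain = (1 - p_fail) ** n_generations
    return 1 - (1 - p_single_chain) ** k_chains

# Placeholder scenario: 10 hand-offs, 20% chance each hand-off breaks the
# chain (mission drift, infiltration, early death of the guardian...).
for k in (1, 3, 5):
    p = p_some_chain_survives(k_chains=k, n_generations=10, p_fail=0.20)
    print(f"{k} chain(s): P(someone is still guarding you) = {p:.2f}")
```

Note that no number of chains drives the failure probability to zero, which is why the paragraph above falls back on a lower bound for acceptable risk rather than a guarantee.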
There is a minuscule probability that, during the next 10 seconds, nanomachines produced by a freshly built AGI sweep in through your window and capture you for infinite life and thus, by your argument, infinite hell. Building on that argument, the case can be made that you should strive to minimize the probability of that outcome. Therefore, suicide.
Edit: My point has already been made by Eliezer. Let’s see how this retracting thingy works.