You are not entitled to assume a maximum disutility, even if you think you see a proof for it (see Confidence Levels Inside and Outside an Argument).
- Ulysses, 22 Jan 2011 2:44 UTC, 6 points, in reply to: DanielLC’s comment on: Pascal’s Mugging: Tiny Probabilities of Vast Utilities
The threat of dystopia underscores the importance of finding or creating a trustworthy, durable institution that will relocate or destroy your body if the political system turns grim.
Of course there is no such thing. Boards can become infiltrated. Missions can drift. Hostile (or even well-intentioned) outside agents can act suddenly before your guardian institution can respond.
But there may be measures you can take to reduce this risk to acceptable levels (i.e., levels comparable to the current risk of exposure to, as Yudkowsky mentioned, a secret singularity-in-a-basement):
You could make contracts with (multiple) members of the younger generation of cryonicists, on the condition that they in turn contract with their own younger generation, and so on, to guard your body throughout the ages.
You can hide a very small bomb in your body that continues to count down slowly even while frozen (I don’t know if we have the technology yet, but it doesn’t sound too sophisticated) so as to limit the amount of divergence from now that you are willing to expose yourself to [an explosion small enough to destroy your brain, but not the brain next to yours].
You can have your body hidden, its location known only to cryonicist leaders.
You can have your body’s destruction forged.
I don’t think any combination of THESE suggestions will suffice. But if you are considering freezing yourself, it is well worth the effort to invent more (and not necessarily share them all online), and to make them possible.