if you’re alive, you can kill yourself when s-risks increase beyond your comfort point. if you’re preserved, then you rely on other people to carry out those wishes
Killing oneself with high certainty of effectiveness is more difficult than most assume. The side effects on health and personal freedom of a failed attempt to end one’s life in the current era are rather extreme.
Anyways, emulating or reviving humans will always incur some cost; I suspect that those who are profitable to emulate or revive will get a lot more emulation time than those who are not.
If a future hostile agent just wants to maximize suffering, will foregoing preservation protect you from it? I think it’s far more likely that an unfriendly agent will simply disregard suffering in pursuit of some other goal. I’ve spent my regular life trying to figure out how to accomplish arbitrary goals more effectively with less suffering, so more of the same set of challenges in an afterlife would be nothing new.
Killing oneself with high certainty of effectiveness is more difficult than most assume.
Dying naturally also isn’t as smooth as plenty of people assume. I’m pretty sure that “taking things into your own hands” leads to a greater expected reduction in suffering in most cases, and it’s not informed rational analysis that prevents people from taking that option.
If a future hostile agent just wants to maximize suffering, will foregoing preservation protect you from it?
Yes? I mean, unless we entertain some extreme abstractions, like it simulating all possible minds of a certain complexity or whatever.