Consider: the most important factor, after technical feasibility, in whether you get revived is whether future society thinks reviving you is the right thing to do. For that reason, absolutely everything which might negatively affect social opinion of the frozen should be avoided.
I… am not even sure that making yourself a particularly appealing target for revival is clever. If people bring you back first because you were famous, you might get to alpha-test that procedure. If you get brought back because you were infamous… Courses of action which might result in you waking up in a virtual “reeducation and rehabilitation” camp are to be avoided. And that is one of the more lenient options. Once your brain is being fed into a nano-scale analyzer, you had better hope the people operating that machinery mean you well. They probably will, since the world generally isn’t getting more hostile, but some of the above ideas would make even very honest and idealistic authorities simply reprogram you.
I’m not sure I completely agree with you, but I’d argue that is exactly the sort of discussion which I am surprised is not already happening. Consider:
I should not make myself an appealing target for resurrection, because I am likely to receive the procedure in the most ‘pre-alpha’ form.
versus
I should make myself the most appealing target for resurrection possible; history shows that if a procedure is expensive or difficult (like going to the moon), it is usually done only infrequently until technology catches up with ambition. The longer I am frozen, the greater the chance that something happens to catastrophically prevent my resurrection, so I desire to be revived in the first wave.
Alternatively:
Future society is likely to punish (or refuse to revive) those who were evil in this life, so I should only adopt strategies which reflect well on me.
versus
Future society is likely to reprogramme people who were evil in this life before reviving them, so I should maximise my chance of making it to future society by any means necessary; it won’t affect my chances of revivification because almost every current human will need reprogramming before revival.
As it happens, my probability distribution over what future society looks like is a lot closer to yours than you might infer from the main body of the post, but the point stands: your belief about future society is hugely important in determining your optimal freezing strategy. This is why I say I am surprised there is not more discussion already happening; cryonics correlates well with people who have carefully considered the question of what future society is likely to look like, but it appears many of them have not then made the link between the two sets of beliefs.
Short of being responsible for a major genocide or something, I think it’s staggeringly unlikely that anything you do in your day-to-day life will affect your chances of getting unfrozen, should you choose to follow the cryonics route.
Consider a silly but analogous situation: while laying the foundation for a new monument in Atlanta, Georgia, builders break into an underground vault containing the bodies of a team of Egyptologists from the early 1800s, placed into suspended animation by an irate mummy’s curse. Understanding of Middle Egyptian has gotten better since they were entombed, though, and contemporary interpretations of artifacts buried with them suggest that moderately expensive sacrifices on their behalf to Aten, an aspect of the sun god, will break their curses.
Now, as it happens, some members of the team were slave-owners at the time of their entombment, a serious violation of present-day ethics, to say the least! Given that they all came from the same cultural context and shared roughly the same mores, though, I view it as implausible that this would substantially affect their chances of getting uncursed. And that’s a pretty extreme example. If one of the porters had a background as a thief, I doubt anyone would even consider it a factor.
But as a side note: if there were a mystical amulet in the tomb which could remove all sympathy for slavery from a person’s mind, don’t you think we would use it on them?
I don’t know which way the decision would fall, but I definitely don’t think the answer would be an immediate and unambiguous “yes”; effective, involuntary modification of people’s ethics isn’t something our culture has ever had to deal with in reality, and when it’s come up in fiction (e.g. Nineteen Eighty-Four) it’s usually been treated negatively. We’re comfortable with encouraging endogenous ethical change by way of perspective or incentives (carrot and stick both), but that’s not quite the same thing.
You could make a consequentialist case for it, of course.
Certainly true, and disturbing, especially for those of us who feel that consequentialism is in some way “correct”. Since far-future people are virtually guaranteed to have radically different values from ours, and would likely have the ability to directly modify our (to them frighteningly evil) values, wouldn’t we (per murder-Gandhi) want to spread a deontological system that forbids tampering with other people’s values, even if we feel that, in general, consequentialism based on our current society’s values is more morally beneficial? That is, would we prefer that some small spark of our moral system survive into the distant future, at the expense of it being lost in the here and now?
Don’t forced religious conversions (especially mass ones) qualify?

Hmm. Interesting question, but I’d say no; a forced conversion can’t affect a person’s actual convictions, only their ritual performance and other aspects of outward behavior. That might over time lead to changes in convictions, but that would be more analogous to our slave-owners upthread being exposed to modern society and learning, in good after-school-special form, that slavery is bad, mmkay?