While I’m not planning to pursue cryopreservation myself, I don’t believe that it’s unreasonable to do so.
Industrial coolants came up in a conversation I was having with my parents (for reasons I am completely unable to remember), and I mentioned that I’d read a bunch of stuff about cryonics lately. My mom then half-jokingly threatened to write me out of her will if I ever signed up for it.
This seemed… disproportionately hostile. She was skeptical of the singularity and my support for the SIAI when it came up a few weeks ago, but she’s not particularly interested in the issue and didn’t make a big deal about it. It wasn’t even close to the level of scorn she apparently has for cryonics. When I asked her about it, she claimed she opposed it based on the physical impossibility of accurately copying a brain. My father and I pointed out that this would literally require the existence of magic; she conceded the point, mentioned that she still thought it was ridiculous, and changed the subject.
This was obviously a case of my mom avoiding her belief’s true weak points by not offering her true objection, rationality failures common enough to deserve blog posts pointing them out; I wasn’t shocked to observe them in the wild. What is shocking to me is that someone who is otherwise quite rational would feel so motivated to protect this particular belief about cryonics. Why is this so important?
That the overwhelming majority of those who share this intense motivation are women (it seems) just makes me more confused. I’ve seen a couple of explanations for this phenomenon, but they aren’t convincing: if these people object to cryonics because they see it as selfish (for example), why do so many of them come up with fake objections? The selfishness objection doesn’t seem like it would be something one would be penalized for making.
Wanting cryo signals disloyalty to your present allies.
Women, it seems, are especially sensitive to this (mothers, wives). Here’s my explanation for why:
1. Women are better than men at analyzing the social-signalling theory of actions. In fact, they (mostly) obsess about that kind of thing, e.g. watching soap operas, gossiping, people watching, etc. (disclaimer: on average)
2. They are less rational than men (only slightly, on average), and this is compounded by the fact that they are less knowledgeable about technical things (disclaimer: on average), especially physics, computer science, etc.
3. Women are more bound by social convention and less able to be lone dissenters. Asch’s conformity experiment found women to be more conforming.
4. Because of (2) and (3), women find it harder than men to take cryo seriously. Therefore, they are much more likely to think that it is not a feasible thing for them to do.
5. Because they are so into analyzing social signalling, they focus in on what cryo signals about a person. Overwhelmingly: selfishness, and, as they don’t think they’re going with you, betrayal.

If you’re right, this suggests a useful spin on the disclosure: “I want you to run away with me—to the FUTURE!”

However, it was my dad, not my mom, who called me selfish when I brought up cryo.
I think that what would work is signing up before you start a relationship, and making it clear that it’s a part of who you are.

For parents, you can’t do this, but they’re your parents; they’ll love you through thick and thin.
Ah, but did you notice that that did not work for Robin? (The NYT article says that Robin discussed it with Peggy when they were getting to know each other.)
It “worked” for Robin to the extent that Robin got to decide whether to marry Peggy after they discussed cryonics. Presumably they decided that they preferred each other to hypothetical spouses with the same stance on cryonics.
Aha, but if I signed up, I’d have to non-conform, darling. Think of what all the other girls at the office would say about me! It would be worse than death!
In the case of refusing cryonics, I doubt that fear of social judgment is the largest factor, or even close. It’s relatively easy to avoid judgment without incurring terrible costs—many people who are signed up for cryonics have simply never mentioned it to the girls and boys in the office. I’m willing to bet that most people, even if you promised that their decision to choose cryonics would be entirely private, would hardly waver in their refusal.
For what it’s worth, Steven Kaas emphasized social weirdness as a decent argument against signing up. I’m not sure what his reasoning was, but given that he’s Steven Kaas I’m going to update on expected evidence (that there is a significant social cost to signing up that I cannot at the moment see).
I don’t get why social weirdness is an issue. Can’t you just not tell anyone that you’ve signed up?

The NYT article points out that you sometimes want other people to know—your wife’s cooperation at the hospital deathbed will make it much easier for the Alcor people to whisk you away.
It’s not an argument against signing up, unless the expected utility of the decision is borderline positive and it’s specifically the increased probability of failure, due to the lack of your family’s assistance, that tilts the balance to the negative.
Given that there are examples of children or spouses actively (and successfully) preventing cryopreservation, that means there’s an additional few % chance of complete failure. Given the low chance to begin with (I think another commenter says no one expects cryonics to succeed with more than 1/4 probability?), that damages the expected utility badly.
An additional failure mode with a few % chance of happening damages the expected utility by a few %. Unless you have some reason to think that this cause of failure is anticorrelated with other causes of failure?
If I initially estimate that cryonics in aggregate has a 10% chance of succeeding, and I then estimate that my spouse/children have a 5% chance of preventing my cryopreservation, does my expected utility decline by only 5%?
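If the two failure modes are independent, the answer is that the loss is multiplicative, not subtractive. A quick sketch of the arithmetic, using the illustrative 10%/5% figures from the comments above (these are assumptions for the example, not real estimates):

```python
# Illustrative probabilities only, taken from the discussion above.
p_works = 0.10     # estimated chance cryonics succeeds absent interference
p_blocked = 0.05   # estimated chance spouse/children prevent preservation

# Assuming the spouse-prevention failure mode is independent of the others:
p_success = p_works * (1 - p_blocked)  # ~0.095

# Expected utility scales with p_success, so it falls by 5% of its value
# (0.10 -> 0.095), not by 5 percentage points (0.10 -> 0.05).
print(p_success)
```

So under the independence assumption, a few-percent chance of family interference damages the expected utility by only a few percent of its value, which is the point the reply below is making.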
Maybe the husband/son should preemptively play the “if you don’t sign up with me, you’re betraying me” card?
If my spouse played that card too hard, I’d sign up for cryonics and then dump them. (“Too hard” would probably mean more than one issue and persisting against clearly expressed boundaries.) Apart from the manipulative aspect it is just, well, stupid. At least manipulate me with “you will be abandoning me!”, you silly man/woman/intelligent agent of choice.
Maybe the husband/son should preemptively play the “if you don’t sign up with me, you’re betraying me” card?
Voted up as an interesting suggestion. That said, I think that if anyone feels a need to be playing that card in a preemptive fashion then a relationship is probably not very functional to start with. Moreover, given that signing up is a change from the status quo I suspect that attempting to play that card would go over poorly in general.
That said, I think that if anyone feels a need to be playing that card in a preemptive fashion then a relationship is probably not very functional to start with.
Can you expand on that? I’m not sure why this particular card is any worse than what people in functional relationships typically do.
Moreover, given that signing up is a change from the status quo I suspect that attempting to play that card would go over poorly in general.
Right, so sign up before entering the relationship, then play that card. :)
I would say that if you aren’t yet married, be prepared to dump them if they won’t sign up with you. Because if they won’t, that is a strong signal to you that they are not a good spouse. These kinds of signals are important to pay attention to in the courtship process.
After marriage, you are hooked regardless of what decision they make on their own suspension arrangements, because it’s their own life. You’ve entered the contract, and the fact they want to do something stupid does not change that. But you should consider dumping them if they refuse to help with the process (at least in simple matters like calling Alcor), as that actually crosses the line into betrayal (however passive) and could get you killed.
Can you expand on that? I’m not sure why this particular card is any worse than what people in functional relationships typically do.
We may have different definitions of “functional relationship.” I’d put very high on the list of elements of a functional relationship that people don’t go out of their way to consciously manipulate each other over substantial life decisions.
Um, it’s a matter of life or death, so of course I’m going to “go out of my way”.
As for “consciously manipulate”, it seems to me that people in all relationships consciously manipulate each other all the time, in the sense of using words to form arguments in order to convince the other person to do what they want. So again, why is this particular form of manipulation not considered acceptable? Is it because you consider it a lie, that is, you don’t think you would really feel betrayed or abandoned if your significant other decided not to sign up with you? (In that case would it be ok if you did think you would feel betrayed/abandoned?) Or is it something else?
So again, why is this particular form of manipulation not considered acceptable?
It is a good question. The distinctive feature of this class of influence is the overt use of guilt and shame, combined with the projection of the speaker’s alleged emotional state onto the actual physical actions of the recipient. It is a symptom of a relationship dynamic that many people consider immature and unhealthy.
It is a symptom of a relationship dynamic that many people consider immature and unhealthy.
I’m tempted to keep asking why (ideally in terms of game theory and/or evolutionary psychology) but I’m afraid of coming across as obnoxious at this point. So let me just ask: do you think there is a better way of making the point that, from the perspective of the cryonicist, he’s not abandoning his SO, but rather it’s the other way around? Or do you think that it’s not worth bringing up at all?
Wanting cryo signals disloyalty to your present allies.
I don’t see why you’d be showing disloyalty to those of your allies who are also choosing cryo.
Here are some more possible reasons for being opposed to cryo.
Loss aversion. “It would be really stupid to put in that hope and money and get nothing for it.”
Fear that it might be too hard to adapt to the future society. (James Halperin’s The First Immortal has it that no one gets thawed unless someone is willing to help them adapt. Would that make cryo seem more or less attractive?)
And, not being an expert on women, I have no idea why there’s a substantial difference in the proportions of men and women who are opposed to cryo.
Difference between showing and signalling disloyalty. To see that it is a signal of disloyalty/lower commitment, consider what signal would be sent out by Rob saying to Ruby: “Yes, I think cryo would work, but I think life would be meaningless without you by my side, so I won’t bother”
It seems to also be a signal of disloyalty/lower commitment to say, “No honey, I won’t throw myself on your funeral pyre after you die.” Why don’t we similarly demand “Yes, I could keep on living, but I think life would be meaningless without you by my side, so I won’t bother” in that case?
You have to differentiate between what an individual thinks/does/decides, and what society as a whole thinks/does/decides.
For example, in a society that generally accepted that it was the “done thing” for a person to die on the funeral pyre of their partner, saying that you wanted to make a deal to buck the trend would certainly be seen as selfish.
Most individuals see the world in terms of options that are socially allowable, and signals are considered relative to what is socially allowable.
if these people object to cryonics because they see it as selfish (for example), why do so many of them come up with fake objections?
I—quite predictably—think this is a special case of the more general problem that people have trouble explaining themselves. Your mom doesn’t give her real reason because she can’t (yet) articulate it. In your case, I think it’s due to two factors: 1) part of the reasoning process is something she doesn’t want to say to your face, so she avoids thinking it, and 2) she’s using hidden assumptions that she falsely assumes you share.
For my part, my dad’s wife is nominally unopposed, bitterly noting that “It’s your money” and then ominously adding that, “you’ll have to talk about this with your future wife, who may find it loopy”.
(Joke’s on her—at this rate, no woman will take that job!)
Sometime ago I offered this explanation for not signing up for cryo: I know signing up would be rational, but can’t overcome my brain’s desire to make me “look normal”. I wonder whether that explanation sounds true to others here, and how many other people feel the same way.
I’m in a typical decision-paralysis state. I want to sign up, I have the money, but I’m also interested in infinite banking, which requires you to get a whole-life plan [1], which would have to be coordinated, which makes it complicated and throws off an ugh field.
What I should probably do is just get the term insurance, sign up for cryo, and then buy amendments to the life insurance contract if I want to get into the infinite banking thing.
[1] Save your breath about the “buy term and invest the difference” spiel, I’ve heard it all before. The investment environment is a joke.
I’m also interested in infinite banking, which requires you to get a whole-life plan
You mentioned this before and I had a quick look at the website and got the impression that it is fairly heavily dependent on US tax laws around whole life insurance and so is not very applicable to other countries. Have you investigated it enough to say whether my impression is accurate or if this is something that makes sense in other countries with differing tax regimes as well?
I haven’t read about the laws in other countries, but I suspect they at least share the aspect that it’s harder to seize assets stored in such a plan, giving you more time to lodge an objection if they get a lien on it.
For a variety of reasons I don’t think cryonics is a good investment for me personally. The social cost of looking weird is certainly a negative factor, though not the only one.
I don’t have anything against cryo, so these are tentative suggestions.
Maybe going in for cryo means admitting how much death hurts, so there’s a big ugh field.
Alternatively, some people are trudging through life, and they don’t want it to go on indefinitely.
Or there are people they want to get away from.
However, none of this fits with “I’ll write you out of my will”. This sounds to me like seeing cryo as a personal betrayal, but I can’t figure out what the underlying premises might be. Unless it’s that being in the will implies that the recipient will also leave money to descendants, and if you aren’t going to die, then you won’t.
If I were going to make a guess, I suspect that saying X is selfish can easily lead to the rejoinder, “It is my money; I have the right to choose what to do with it,” especially within the modern world. Saying X is selfish so it shouldn’t be done can also be seen as interfering with another person’s business, which is frowned upon in lots of social circles. It is also called moralising. So she may be unconsciously avoiding that response.
This may be true in some cases, but I don’t think it is in this one; my mom has no trouble moralizing on any other topic, even ones about which I care a great deal more than I do about cryonics. For example, she’s criticized polyamory as unrealistic and bisexuality as non-existent on multiple occasions, both of which have a rather significant impact on how I live my life.
I wasn’t there at the discussions, but those seem different types of statements than saying that they are “wrong/selfish” and that by implication you are a bad person for doing them. She is impugning your judgement in all cases rather than your character.
An important distinction, it’s true. I feel like it should make a difference in this situation that I declared my intention to not pursue cryopreservation, but I’m not sure that it does.
Either way, I can think of other specific occasions when my mom has specifically impugned my character as well as my judgment. (“Lazy” is the word that most immediately springs to mind, but there are others.)
It occurs to me that as I continue to add details my mom begins to look like a more and more horrible person; this is generally not the case.
when he first announced his intention to have his brain surgically removed from his freshly vacated cadaver and preserved in liquid nitrogen
I’m fairly sure that head-only preservation doesn’t involve any brain-removal. It’s interesting that in context the purpose of the phrase was to present a creepy image of cryonics, and so the bias towards the phrases that accomplish this goal won over the constraint of not generating fiction.
I wonder if Peggy’s apparent disvalue of Robin’s immortality represents a true preference, and if so, how should an FAI take it into account while computing humanity’s CEV?
It should store a canonical human “base type” in a data structure somewhere. Then it should store the information about how all humans deviate from the base type, so that they can in principle be reconstituted as if they had just been through a long sleep.
Then it should use Peggy’s body and Robin’s body for fuel.
It seems plausible that the “know more” part of EV should include the result of modelling the application of CEV to humanity; i.e., CEV is not just the aggregation of individuals’ EVs, but one of the fixed points of humanity’s CEV after reflection on the results of applying CEV.

Maybe Peggy’s model will see that her preferences would result in unnecessary deaths, and that death is not a necessary part of society’s existence or of her children’s prospering.
It seems to me if it were just some factual knowledge that Peggy is missing, Robin would have been able to fill her in and thereby change her mind.
Of course Robin isn’t a superintelligent being, so perhaps there is an argument that would change Peggy’s mind that Robin hasn’t thought of yet, but how certain should we be of that?
Communicating complex factual knowledge in an emotionally charged situation is hard, to say nothing of actually causing a change in deep moral responses. I don’t think failure is strong evidence for the nonexistence of such information. (Especially since I think one of the most likely sorts of knowledge to have an effect is about the origin — evolutionary and cognitive — of the relevant responses, and trying to reach an understanding of that is really hard.)
You make a good point, but why is communicating complex factual knowledge in an emotionally charged situation hard? It must be that we’re genetically programmed to block out other people’s arguments when we’re in an emotionally charged state. In other words, one explanation for why Robin has failed to change Peggy’s mind is that Peggy doesn’t want to know whatever facts or insights might change her mind on this matter. Would it be right for the FAI to ignore that “preference” and give Peggy’s model the relevant facts or insights anyway?
ETA: This does suggest some practical advice: try to teach your wife and/or mom the relevant facts and insights before bringing up the topic of cryonics.
You are underestimating just how enormously Peggy would have to change her mind. Her life’s work involves emotionally comforting people and their families through the final days of terminal illness. She has accepted her own mortality and the mortality of everyone else as one of the basic facts of life. As no one has been resurrected yet, death still remains a basic fact of life for those who don’t accept the information-theoretic definition of death.
To change Peggy’s mind, Robin would not just have to convince her to accept his own cryonic suspension, but she would have to be convinced to change her life’s work—to no longer spend her working hours convincing people to accept death, but to convince them to accept death while simultaneously signing up for very expensive and very unproven crazy sounding technology.
Changing the mind of the average cryonics-opposed life partner should be a lot easier than changing Peggy’s mind. Most cryonics-opposed life partners have not dedicated their lives to something diametrically opposed to cryonics.
This does suggest some practical advice: try to teach your wife and/or mom the relevant facts and insights before bringing up the topic of cryonics.
You mean you want to make an average IQ woman into a high-grade rationalist?
Good luck!
Better plan: go with Rob Ettinger’s advice. If your wife/gf doesn’t want to play ball, dump her. (This is a more alpha-male attitude to the problem, too. A woman will instinctively sense that you are approaching her objection from an alpha-male stance of power, which will probably have more effect on her than any argument)
In fact I’m willing to bet at steep odds that Mystery could get a female partner to sign up for cryo with him, whereas a top rationalist like Hanson is floundering.
I don’t think this is about doing what you think best, it’s about allowing you to do what you think best. And yes, you should definitely threaten abandonment in these cases, or at least you’re definitely entitled to threatening and/or practicing abandonment in such cases.
Better yet, sign up while you’re single, and present it as a fait accompli. It won’t get her signed up, but I’d be willing to bet she won’t try to make you drop your subscription.
Well the practical advice is being offered to LW, and I’d guess that most of the people here are not average IQ, and neither are their friends and family. I personally think it’s a great idea to try and give someone the relevant factual background to understand why cryonics is desirable before bringing up the option. It probably wouldn’t work, simply because almost all attempts to sell cryonics to anyone don’t work, but it should at least decrease the probability of them reacting with a knee-jerk dismissal of the whole subject as absurd.
I maintain that if you are male with a female relatively neurotypical partner, the probability of success of making her sign on the dotted line for cryo, or accepting wholeheartedly your own cryo is not maximized by using rational argument, rather it is maximized by having an understanding of the emotional world that the fairer sex inhabit, and how to control her emotions so that she does what you think best. She won’t listen to your words, she’ll sense the emotions and level of dominance in you, and then decide based on that, and then rationalize that decision.
This is a purely positive statement, i.e. it is empirically testable, and I hereby denounce any connotation that one might interpret it to have. Let me explicitly disclaim that I don’t think that women’s emotional nature makes them inferior, just different, and in need of different treatment. Let me also disclaim that this applies only on average, and that there will be exceptions, i.e. highly systematizing women who will, in fact, be persuaded by rational argument.
I mostly agree with you. I would even expand your point to say that if you want to convince anyone (who isn’t a perfect Bayesian) to do anything, the probability of success will almost always be higher if you use primarily emotional manipulation rather than rational argument. But cryonics inspires such strong negative emotional reactions in people that I think it would be nearly impossible to combat those with emotional manipulation of the type you describe alone. I haven’t heard of anyone choosing cryonics for themselves without having to make a rational effort to override their gut response against it, and that requires understanding the facts. Besides, I think the type of males who choose cryonics tend to have female partners of at least above-average intelligence, so that should make the explanatory process marginally less difficult.
Besides, I think the type of males who choose cryonics tend to have female partners of at least above-average intelligence, so that should make the explanatory process marginally less difficult.
Right, but the data says that it is a serious problem. Cryonics wife problem, etc.
Yes—calling it “factual knowledge” suggests it’s only about the sort of fact you could look up in the CIA World Factbook, as opposed to what we would normally call “insight”.
I meant something like embedding her in a culture where death is unnecessary, rather than directly arguing for it. Words aren’t the best communication channel for changing moral values. Will it be enough? I hope so, provided the death of the carriers of moral values isn’t a necessary condition for moral progress.
Edit: BTW, if CEV is computed using humans’ reflection on its application, then the FAI cannot passively combine all volitions; it must search for and somehow choose a fixed point. Which rule should govern that process?
Good article overall. It gives a human feel to the decision of cryonics, in particular by focusing on the unfair assault the choice attracts (thus appealing to cryonicists’ sense of status).
The hostile wife phenomenon doesn’t seem to have been mentioned much here. Is it less common than the article suggests or has it been glossed over because it doesn’t support the pro-cryonics position? Or has it been mentioned and I wasn’t paying attention?
At last count (a while ago admittedly), most LWers were not married, and almost none were actually signed up for cryonics. So perhaps this phenomenon just isn’t a salient issue to most people here.
Data point FWIW: my partners are far from convinced of the wisdom of cryonics, but they respect my choices. Much of the strongest opposition has come from my boyfriend, who keeps saying “why not just buy a lottery ticket? It’s cheaper”.
They’re both things with low probabilities of success, and extremely large pay-offs.
To someone with a certain view of the future, or a moderately low “maximum pay-off” threshold, the pay-off of cryonics could be the same as the pay-off for a lottery win.
At which point the lottery is a cheaper, but riskier, gamble. Again, if someone has a certain view of the future, or a “minimum probability” threshold (which both are under) then this difference in risk could be unnoticed in their thoughts.
At which point the two become identical, but one is more expensive.
It’s quick-and-dirty thinking, but it’s one easy way to end up with the connection, and it doesn’t involve any utility calculations (in fact, utility calculations would be anathema to this sort of thinking).
One big barrier I hit in talking to some of those close to me about this is that I can’t seem to explain the distinction between wanting the feeling of hope that I might live a very long time, and actually wanting to live a long time. Lots of people just say “if you want to believe in life after death, why not just go to church? It’s cheaper”.
Lots of people just say “if you want to believe in life after death, why not just go to church? It’s cheaper”.
I could see people saying that if they don’t believe that cryonics has any chance at all of working. It might be hard to tell. If I told people “there’s a good chance that cryonics will enable me to live for hundreds of years”, I’m sure many would respond by nodding, the same way they’d nod if I told them that “there’s a good chance that I’ll go to Valhalla after I die”. Sometimes respect looks like credulity, you know? Do you think that’s what’s happening here?
Apply to that reply the same transformation of my words that is causing me problems, and you get “I only want to believe in things that I believe are true”.
I did think this was quite a likely explanation. As I’m not married the point would likely not have been terribly salient when reading about pros and cons.
A New York Times article on Robin Hanson and his wife Peggy Jackson’s disagreement on cryonics:
http://www.nytimes.com/2010/07/11/magazine/11cryonics-t.html?ref=health&pagewanted=all
While I’m not planning to pursue cryopreservation myself, I don’t believe that it’s unreasonable to do so.
Industrial coolants came up in a conversation I was having with my parents (for reasons I am completely unable to remember), and I mentioned that I’d read a bunch of stuff about cryonics lately. My mom then half-jokingly threatened to write me out of her will if I ever signed up for it.
This seemed… disproportionately hostile. She was skeptical of the singularity and my support for the SIAI when it came up a few weeks ago, but she’s not particularly interested in the issue and didn’t make a big deal about it. It wasn’t even close to the level of scorn she apparently has for cryonics. When I asked her about it, she claimed she opposed it based on the physical impossibility of accurately copying a brain. My father and I pointed out that this would literally require the existence of magic, she conceded the point, mentioned that she still thought it was ridiculous, and changed the subject.
This was obviously a case of my mom avoiding her belief’s true weak points by not offering her true objection, rationality failures common enough to deserve blog posts pointing them out; I wasn’t shocked to observe them in the wild. What is shocking to me is that someone who is otherwise quite rational would feel so motivated to protect this particular belief about cryonics. Why is this so important?
That the overwhelming majority of those who share this intense motivation are women (it seems) just makes me more confused. I’ve seen a couple of explanations for this phenomenon, but they aren’t convincing: if these people object to cryonics because they see it as selfish (for example), why do so many of them come up with fake objections? The selfishness objection doesn’t seem like it would be something one would be penalized for making.
Wanting cryo signals disloyalty to your present allies.
Women, it seems, are especially sensitive to this (mothers, wives). Here’s my explanation for why:
Women are better than men at analyzing the social-signalling theory of actions. In fact, they (mostly) obsess about that kind of thing, e.g. watching soap operas, gossiping, people watching, etc. (disclaimer: on average)
They are less rational than men (only slightly, on average), and this is compounded by the fact that they are less knowledgeable about technical things (disclaimer: on average), especially physics, computer science, etc.
Women are more bound by social convention and less able to be lone dissenters. Asch’s conformity experiment found women to be more conforming.
Because of (2) and (3), women find it harder than men to take cryo seriously. Therefore, they are much more likely to think that it is not a feasible thing for them to do
Because they are so into analyzing social signalling, they focus in on what cryo signals about a person. Overwhelmingly: selfishness, and as they don’t think they’re going with you, betrayal.
If you’re right, this suggests a useful spin on the disclosure: “I want you to run away with me—to the FUTURE!”
However, it was my dad, not my mom, who called me selfish when I brought up cryo.
I think that what would work is signing up before you start a relationship, and making it clear that it’s a part of who you are.
For parents, you can’t do this, but they’re your parents, they’ll love you through thick and thin.
Ah, but did you notice that that did not work for Robin? (The NYT article says that Robin discussed it with Peggy when they were getting to know each other.)
It “worked” for Robin to the extent that Robin got to decide whether to marry Peggy after they discussed cryonics. Presumably they decided that they preferred each other to hypothetical spouses with the same stance on cryonics.
Thanks. (Upvoted.)
Maybe the husband/son should preemptively play the “if you don’t sign up with me, you’re betraying me” card?
Aha, but if I signed up, I’d have to non-conform, darling. Think of what all the other girls at the office would say about me! It would be worse than death!
In the case of refusing cryonics, I doubt that fear of social judgment is the largest factor or even close. It’s relatively easy to avoid judgment without incurring terrible costs—many people signed up for cryonics have simply never mentioned it to the girls and boys in the office. I’m willing to bet that most people, even if you promised that their decision to choose cryonics would be entirely private, would hardly waver in their refusal.
For what it’s worth, Steven Kaas emphasized social weirdness as a decent argument against signing up. I’m not sure what his reasoning was, but given that he’s Steven Kaas I’m going to update on expected evidence (that there is a significant social cost to signing up that I cannot at the moment see).
I don’t get why social weirdness is an issue. Can’t you just not tell anyone that you’ve signed up?
The NYT article points out that you sometimes want other people to know—your wife’s cooperation at the hospital deathbed will make it much easier for the Alcor people to whisk you away.
It’s not an argument against signing up, unless the expected utility of the decision is borderline positive and it’s specifically the increased probability of failure from lacking your family’s assistance that tilts the balance to the negative.
Given that there are examples of children or spouses actively preventing cryopreservation (and succeeding), that means there’s an additional few % chance of complete failure. Given the low chance to begin with (I think another commenter says no one expects cryonics to succeed with more than 1⁄4 probability?), that damages the expected utility badly.
An additional failure mode with a few % chance of happening damages the expected utility by a few %. Unless you have some reason to think that this cause of failure is anticorrelated with other causes of failure?
If I initially estimate that cryonics in aggregate has a 10% chance of succeeding, and I then estimate that my spouse/children have a 5% chance of preventing my cryopreservation, does my expected utility decline by only 5%?
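If the two failure modes are independent, the arithmetic can be sketched in a few lines (numbers hypothetical, taken from the parent comment):

```python
# Hypothetical numbers from the parent comment.
p_base = 0.10   # prior estimate that cryonics succeeds at all
p_block = 0.05  # independent chance that family blocks preservation

# With independent failure modes, success probabilities multiply:
p_success = p_base * (1 - p_block)
print(round(p_success, 3))  # 0.095

# The decline is 5% *relative* (0.10 -> 0.095), i.e. half a
# percentage point absolute, not five percentage points.
relative_decline = 1 - p_success / p_base
print(round(relative_decline, 3))  # 0.05
```

So, under independence, a 5% chance of family interference shaves 5% off whatever expected utility you started with, which is what the grandparent claimed.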
Are you still involved in Remember 11?
If my spouse played that card too hard I’d sign up to cryonics then I’d dump them. (“Too hard” would probably mean more than one issue and persisting against clearly expressed boundaries.) Apart from the manipulative aspect it is just, well, stupid. At least manipulate me with “you will be abandoning me!” you silly man/woman/intelligent agent of choice.
Voted up as an interesting suggestion. That said, I think that if anyone feels a need to be playing that card in a preemptive fashion then a relationship is probably not very functional to start with. Moreover, given that signing up is a change from the status quo I suspect that attempting to play that card would go over poorly in general.
Can you expand on that? I’m not sure why this particular card is any worse than what people in functional relationships typically do.
Right, so sign up before entering the relationship, then play that card. :)
I would say that if you aren’t yet married, be prepared to dump them if they won’t sign up with you. Because if they won’t, that is a strong signal to you that they are not a good spouse. These kinds of signals are important to pay attention to in the courtship process.
After marriage, you are hooked regardless of what decision they make on their own suspension arrangements, because it’s their own life. You’ve entered the contract, and the fact they want to do something stupid does not change that. But you should consider dumping them if they refuse to help with the process (at least in simple matters like calling Alcor), as that actually crosses the line into betrayal (however passive) and could get you killed.
We may have different definitions of “functional relationship.” I’d put very high on the list of elements of a functional relationship that people don’t go out of their way to consciously manipulate each other over substantial life decisions.
Um, it’s a matter of life or death, so of course I’m going to “go out of my way”.
As for “consciously manipulate”, it seems to me that people in all relationships consciously manipulate each other all the time, in the sense of using words to form arguments in order to convince the other person to do what they want. So again, why is this particular form of manipulation not considered acceptable? Is it because you consider it a lie, that is, you don’t think you would really feel betrayed or abandoned if your significant other decided not to sign up with you? (In that case would it be ok if you did think you would feel betrayed/abandoned?) Or is it something else?
It is a good question. The distinctive feature of this class of influence is the overt use of guilt and shame, combined with the projection of the speaker’s alleged emotional state onto the actual physical actions of the recipient. It is symptomatic of a relationship dynamic that many people consider immature and unhealthy.
I’m tempted to keep asking why (ideally in terms of game theory and/or evolutionary psychology) but I’m afraid of coming across as obnoxious at this point. So let me just ask: do you think there is a better way of making the point that, from the perspective of the cryonicist, he’s not abandoning his SO, but rather it’s the other way around? Or do you think that it’s not worth bringing up at all?
I don’t see why you’d be showing disloyalty to those of your allies who are also choosing cryo.
Here are some more possible reasons for being opposed to cryo.
Loss aversion. “It would be really stupid to put in that hope and money and get nothing for it.”
Fear that it might be too hard to adapt to the future society. (James Halperin’s The First Immortal has it that no one gets thawed unless someone is willing to help them adapt. Would that make cryo seem more or less attractive?)
And, not being an expert on women, I have no idea why there’s a substantial difference in the proportions of men and women who are opposed to cryo.
Difference between showing and signalling disloyalty. To see that it is a signal of disloyalty/lower commitment, consider what signal would be sent out by Rob saying to Ruby: “Yes, I think cryo would work, but I think life would be meaningless without you by my side, so I won’t bother”
It’s seems to also be a signal of disloyalty/lower commitment to say, “No honey, I won’t throw myself on your funeral pyre after you die.” Why don’t we similarly demand “Yes, I could keep on living, but I think life would be meaningless without you by my side, so I won’t bother” in that case?
You have to differentiate between what an individual thinks/does/decides, and what society as a whole thinks/does/decides.
For example, in a society that generally accepted that it was the “done thing” for a person to die on the funeral pyre of their partner, saying that you wanted to make a deal to buck the trend would certainly be seen as selfish.
Most individuals see the world in terms of options that are socially allowable, and signals are considered relative to what is socially allowable.
I—quite predictably—think this is a special case of the more general problem that people have trouble explaining themselves. Your mom doesn’t give her real reason because she can’t (yet) articulate it. In your case, I think it’s due to two factors: 1) part of the reasoning process is something she doesn’t want to say to your face so she avoids thinking it, and 2) she’s using hidden assumptions that she falsely assumes you share.
For my part, my dad’s wife is nominally unopposed, bitterly noting that “It’s your money” and then ominously adding that, “you’ll have to talk about this with your future wife, who may find it loopy”.
(Joke’s on her—at this rate, no woman will take that job!)
Sometime ago I offered this explanation for not signing up for cryo: I know signing up would be rational, but can’t overcome my brain’s desire to make me “look normal”. I wonder whether that explanation sounds true to others here, and how many other people feel the same way.
I’m in a typical decision-paralysis state. I want to sign up, I have the money, but I’m also interested in infinite banking, which requires you to get a whole-life plan [1], which would have to be coordinated, which makes it complicated and throws off an ugh field.
What I should probably do is just get the term insurance, sign up for cryo, and then buy amendments to the life insurance contract if I want to get into the infinite banking thing.
[1] Save your breath about the “buy term and invest the difference” spiel, I’ve heard it all before. The investment environment is a joke.
You mentioned this before and I had a quick look at the website and got the impression that it is fairly heavily dependent on US tax laws around whole life insurance and so is not very applicable to other countries. Have you investigated it enough to say whether my impression is accurate or if this is something that makes sense in other countries with differing tax regimes as well?
I haven’t read about the laws in other countries, but I suspect they at least share the aspect that it’s harder to seize assets stored in such a plan, giving you more time to lodge an objection if they get a lien on it.
For a variety of reasons I don’t think cryonics is a good investment for me personally. The social cost of looking weird is certainly a negative factor, though not the only one.
I don’t have anything against cryo, so these are tentative suggestions.
Maybe going in for cryo means admitting how much death hurts, so there’s a big ugh field.
Alternatively, some people are trudging through life, and they don’t want it to go on indefinitely.
Or there are people they want to get away from.
However, none of this fits with “I’ll write you out of my will”. This sounds to me like seeing cryo as a personal betrayal, but I can’t figure out what the underlying premises might be. Unless it’s that being in the will implies that the recipient will also leave money to descendants, and if you aren’t going to die, then you won’t.
Is there evidence for this? Specifically the “intense” part?
ETA: Did you ask her why she had such strong feelings about it? Was she able to answer?
The evidence is largely anecdotal, I think. There are certainly stories of cryonics ending marriages out there.
I haven’t yet asked her about it, but I plan to do so next time we talk.
If I was going to make a guess, I suspect that saying X is selfish can easily lead to the rejoinder, “It is my money; I have the right to choose what to do with it,” especially within the modern world. Saying X is selfish so it shouldn’t be done can also be seen as interfering with another person’s business, which is frowned upon in lots of social circles. It is also called moralising. So she may be unconsciously avoiding that response.
This may be true in some cases, but I don’t think it is in this one; my mom has no trouble moralizing on any other topic, even ones about which I care a great deal more than I do about cryonics. For example, she’s criticized polyamory as unrealistic and bisexuality as non-existent on multiple occasions, both of which have a rather significant impact on how I live my life.
I wasn’t there at the discussions, but those seem different types of statements than saying that they are “wrong/selfish” and that by implication you are a bad person for doing them. She is impugning your judgement in all cases rather than your character.
An important distinction, it’s true. I feel like it should make a difference in this situation that I declared my intention to not pursue cryopreservation, but I’m not sure that it does.
Either way, I can think of other specific occasions when my mom has specifically impugned my character as well as my judgment. (“Lazy” is the word that most immediately springs to mind, but there are others.)
It occurs to me that as I continue to add details my mom begins to look like a more and more horrible person; this is generally not the case.
A factual error:
I’m fairly sure that head-only preservation doesn’t involve any brain-removal. It’s interesting that in context the purpose of the phrase was to present a creepy image of cryonics, and so the bias towards the phrases that accomplish this goal won over the constraint of not generating fiction.
I wonder if Peggy’s apparent disvalue of Robin’s immortality represents a true preference, and if so, how should an FAI take it into account while computing humanity’s CEV?
It should store a canonical human “base type” in a data structure somewhere. Then it should store the information about how all humans deviate from the base type, so that they can in principle be reconstituted as if they had just been through a long sleep.
Then it should use Peggy’s body and Robin’s body for fuel.
It seems plausible that the “know more” part of EV should include the result of modelling the application of CEV to humanity, i.e. CEV is not just the result of aggregating individuals’ EVs, but one of the fixed points of humanity’s CEV after reflection on the results of applying CEV.
Maybe Peggy’s model will see that her preferences would result in unnecessary deaths, and that death is not a necessary condition for society to exist or for her children to prosper.
It seems to me if it were just some factual knowledge that Peggy is missing, Robin would have been able to fill her in and thereby change her mind.
Of course Robin’s isn’t a superintelligent being, so perhaps there is an argument that would change Peggy’s mind that Robin hasn’t thought of yet, but how certain should we be of that?
Communicating complex factual knowledge in an emotionally charged situation is hard, to say nothing of actually causing a change in deep moral responses. I don’t think failure is strong evidence for the nonexistence of such information. (Especially since I think one of the most likely sorts of knowledge to have an effect is about the origin — evolutionary and cognitive — of the relevant responses, and trying to reach an understanding of that is really hard.)
You make a good point, but why is communicating complex factual knowledge in an emotionally charged situation hard? It must be that we’re genetically programmed to block out other people’s arguments when we’re in an emotionally charged state. In other words, one explanation for why Robin has failed to change Peggy’s mind is that Peggy doesn’t want to know whatever facts or insights might change her mind on this matter. Would it be right for the FAI to ignore that “preference” and give Peggy’s model the relevant facts or insights anyway?
ETA: This does suggest some practical advice: try to teach your wife and/or mom the relevant facts and insights before bringing up the topic of cryonics.
You are underestimating just how enormously Peggy would have to change her mind. Her life’s work involves emotionally comforting people and their families through the final days of terminal illness. She has accepted her own mortality and the mortality of everyone else as one of the basic facts of life. As no one has been resurrected yet, death still remains a basic fact of life for those that don’t accept the information-theoretic definition of death.
To change Peggy’s mind, Robin would not just have to convince her to accept his own cryonic suspension, but she would have to be convinced to change her life’s work—to no longer spend her working hours convincing people to accept death, but to convince them to accept death while simultaneously signing up for very expensive and very unproven crazy sounding technology.
Changing the mind of the average cryonics-opposed life partner should be a lot easier than changing Peggy’s mind. Most cryonics-opposed life partners have not dedicated their lives to something diametrically opposed to cryonics.
You mean you want to make an average IQ woman into a high-grade rationalist?
Good luck!
Better plan: go with Rob Ettinger’s advice. If your wife/gf doesn’t want to play ball, dump her. (This is a more alpha-male attitude to the problem, too. A woman will instinctively sense that you are approaching her objection from an alpha-male stance of power, which will probably have more effect on her than any argument)
In fact I’m willing to bet at steep odds that Mystery could get a female partner to sign up for cryo with him, whereas a top rationalist like Hanson is floundering.
Is this generalizable? Should I, too, threaten my loved ones with abandonment whenever they don’t do what I think would be best?
I don’t think this is about doing what you think best, it’s about allowing you to do what you think best. And yes, you should definitely threaten abandonment in these cases, or at least you’re definitely entitled to threatening and/or practicing abandonment in such cases.
I’m not sure. It might work, but you’re going outside of my areas of expertise.
Better yet, sign up while you’re single, and present it as a fait accompli. It won’t get her signed up, but I’d be willing to bet she won’t try to make you drop your subscription.
Well the practical advice is being offered to LW, and I’d guess that most of the people here are not average IQ, and neither are their friends and family. I personally think it’s a great idea to try and give someone the relevant factual background to understand why cryonics is desirable before bringing up the option. It probably wouldn’t work, simply because almost all attempts to sell cryonics to anyone don’t work, but it should at least decrease the probability of them reacting with a knee-jerk dismissal of the whole subject as absurd.
I maintain that if you are male with a female relatively neurotypical partner, the probability of success of making her sign on the dotted line for cryo, or accepting wholeheartedly your own cryo is not maximized by using rational argument, rather it is maximized by having an understanding of the emotional world that the fairer sex inhabit, and how to control her emotions so that she does what you think best. She won’t listen to your words, she’ll sense the emotions and level of dominance in you, and then decide based on that, and then rationalize that decision.
This is a purely positive statement, i.e. it is empirically testable, and I hereby denounce any connotation that one might interpret it to have. Let me explicitly disclaim that I don’t think that women’s emotional nature makes them inferior, just different, and in need of different treatment. Let me also disclaim that this applies only on average, and that there will be exceptions, i.e. highly systematizing women who will, in fact, be persuaded by rational argument.
I mostly agree with you. I would even expand your point to say that if you want to convince anyone (who isn’t a perfect Bayesian) to do anything, the probability of success will almost always be higher if you use primarily emotional manipulation rather than rational argument. But cryonics inspires such strong negative emotional reactions in people that I think it would be nearly impossible to combat those with emotional manipulation of the type you describe alone. I haven’t heard of anyone choosing cryonics for themselves without having to make a rational effort to override their gut response against it, and that requires understanding the facts. Besides, I think the type of males who choose cryonics tend to have female partners of at least above-average intelligence, so that should make the explanatory process marginally less difficult.
Right, but the data says that it is a serious problem. Cryonics wife problem, etc.
I wonder how these women feel about being labeled “The Hostile Wife Phenomenon”?
Full of righteous indignation, I should imagine. After all, they see it as their own husbands betraying them.
Yes—calling it “factual knowledge” suggests it’s only about the sort of fact you could look up in the CIA World Factbook, as opposed to what we would normally call “insight”.
I meant something like embedding her in a culture where death is unnecessary, rather than directly arguing for that. Words aren’t the best communication channel for changing moral values. Will it be enough? I hope so, if the death of carriers of moral values isn’t a necessary condition for moral progress.
Edit: BTW, if CEV is computed using humans’ reflection on its application, then it means that the FAI cannot passively combine all volitions; it must search for and somehow choose a fixed point. Which rule should govern that process?
That was very nearly terrifying.
Good article overall. It gives a human feel to the decision of cryonics, in particular by focusing on the unfair assault it attracts (thus appealing to cryonicists’ sense of status).
The hostile wife phenomenon doesn’t seem to have been mentioned much here. Is it less common than the article suggests or has it been glossed over because it doesn’t support the pro-cryonics position? Or has it been mentioned and I wasn’t paying attention?
At last count (a while ago admittedly), most LWers were not married, and almost none were actually signed up for cryonics. So perhaps this phenomenon just isn’t a salient issue to most people here.
I’m married and with kids, my wife supports my (so far theoretical only) interest in cryo. Though she says she doesn’t want it for herself.
Data point FWIW: my partners are far from convinced of the wisdom of cryonics, but they respect my choices. Much of the strongest opposition has come from my boyfriend, who keeps saying “why not just buy a lottery ticket? It’s cheaper”.
Well, I hope you showed him your expected utility calculations!
I’m afraid that isn’t really a good fit for how he thinks about these things...
It seems a bit odd to me that he would use the lottery comparison, in that case. Or no?
They’re both things with low probabilities of success, and extremely large pay-offs.
To someone with a certain view of the future, or a moderately low “maximum pay-off” threshold, the pay-off of cryonics could be the same as the pay-off for a lottery win.
At which point the lottery is a cheaper, but riskier, gamble. Again, if someone has a certain view of the future, or a “minimum probability” threshold (which both are under) then this difference in risk could be unnoticed in their thoughts.
At which point the two become identical, but one is more expensive.
It’s quick-and-dirty thinking, but it’s one easy way to end up with the connection, and it doesn’t involve any utility calculations (in fact, utility calculations would be anathema to this sort of thinking).
One big barrier I hit in talking to some of those close to me about this is that I can’t seem to explain the distinction between wanting the feeling of hope that I might live a very long time, and actually wanting to live a long time. Lots of people just say “if you want to believe in life after death, why not just go to church? It’s cheaper”.
I could see people saying that if they don’t believe that cryonics has any chance at all of working. It might be hard to tell. If I told people “there’s a good chance that cryonics will enable me to live for hundreds of years”, I’m sure many would respond by nodding, the same way they’d nod if I told them that “there’s a good chance that I’ll go to Valhalla after I die”. Sometimes respect looks like credulity, you know? Do you think that’s what’s happening here?
Yes. I’m happy that people respect my choices, but when they “respect my beliefs” it strikes me as incredibly disrespectful.
And if you reply “I only want to believe in things that are true?”
Apply the same transformation to my words that is causing me problems to that reply, and you get “I only want to believe in things that I believe are true”.
That’s a bit scary.
It was mentioned, and you weren’t paying attention ;)
I did think this was quite a likely explanation. As I’m not married the point would likely not have been terribly salient when reading about pros and cons.