If you want to make friends with cryonicists, sign up. For every one person I meet who is signed up, I hear excuses from ten others: It won’t work. It will work, but I could be revived and tortured by an evil AI. The freezing process could cause insanity. It’ll probably work, but I’ve been too lazy to sign up. I’m so needy I’ll kill myself without friends. Etc.
It gets old really fast.
Wow, calling me names has made me really inclined to take advice from you. I’ll get right on that, since you’re so insightful about my personal qualities and must know the best thing to do in this case, too.
That would be a fine way to spend money, wouldn’t it, paying them to not let me die only for me to predictably undo their work?
My comment about suicide was a joke to contrast my recommendation: make friends.
I think you assign high probability to all of the following:
1. None of your current friends will ever sign up for cryonics.
2. You won’t make friends with any current cryonicists.
3. You won’t make friends after being revived.
4. Your suicidal neediness will be incurable by future medicine.
Please correct me if I’m wrong. If you think any of those are unlikely and you think cryonics will work, then you should sign up by yourself.
1. Yeah. Even though a couple of them have expressed interest, there is a huge leap from being interested to actually signing up.
2. This is my present plan. We’ll see if it works.
3. I’m not willing to bet on this.
4. I do not want my brain messed with. If I expected to arrive in a future that would mess with my brain without my permission, I would not want to go there.
I have to say, if 3 fails, I would tend to downvote that future pretty strongly. We seem to have very different ideas of what a revival-world will and should look like, conditional on revival working at all.
I was including a “promptly enough” in the “will make friends” thing. I’m sure that, if I could stay alive and sane long enough, I’d make friends. I don’t think I could stay alive and sane and lonely long enough to make close enough friends without my brain being messed with (not okay) or me being forcibly prevented from offing myself (not fond of this either).
If your life were literally at stake and I were a Friendly AI, I bet I could wake you up next to someone who could become fast friends with you within five hours. It doesn’t seem like a weak link in the chain, let alone the weakest one.
It is the most terrifying link in the chain. Most of the other links, if they break, just look like a dead Alicorn, not a dead Alicorn who killed herself in a fit of devastating, miserable starvation for personal connection.
If you thought it was reasonably likely that, given the success of cryonics, you’d be obliged to live without something you’d presently feel suicidal without (I’m inclined to bring up your past analogy of sex and a heroin fix here, but substitute whatever works for you), would you be so gung-ho?
I could sorta understand this if we were talking about one person you couldn’t live without; it’s the idea of worrying about not having any deep friends in general that’s making me blink.
Some people are convinced they’ll have to live without the strangest things after the Singularity… having encountered something possibly similar before, I do seriously wonder if you might be suffering from a general hope-in-the-future deficiency.
PS/Edit: Spider Robinson’s analogy, not mine.
If you were the Friendly AI and Alicorn failed to make a fast friend as predicted, and that resulted in suicidal depression, would that depression be defined as mental illness and treated as such? Would recent wake-ups have the right to commit suicide? I think that’s an incredibly hard question, so please don’t answer if you don’t want to.
Have you written anything on suicide in the metaethics sequence or elsewhere?
And the relevant question extends to the assumption behind the phrase ‘and treated as such’. Do people have the right to be nuts in general?
I suppose having to rigorously prove the mathematics behind these questions is why Eliezer is so much more pessimistic about the probability of AI killing us than I am.
I have only managed to live without particular persons who’ve departed from my life for any reason by virtue of already having other persons to console me.
That said, there are a handful of people whose loss would trouble me especially terribly, but I could survive it with someone else around to grieve with.
I would think that the organization reviving you would be either a family foundation, a general charity, or a fan club of yours. (Don’t laugh! There are fan clubs for superstars in India. Extend that further into the future and each LW commenter might have a fan club.) Since you will be, relatively speaking, an early adopter of cryonics, you will also be, relatively speaking, a late riser. Cryonics goes LIFO, if I understand it correctly.
I’m pretty sure that, now that your fears are explicitly stated in a public forum, they are on the record for almost all eternity and will be given sufficient consideration by those reviving you.
Eliezer has already presented one solution: a make-do best friend who can be upgraded to sentience whenever the need arises.
A simpler solution would be a human child holding your hand and saying “I’m your great great grandchild.” Are you sure you still wouldn’t care enough? (A dirty mind hack, I understand, but terribly easy to implement.)
Probably worth backing that record up, though, in the form of a stone tablet adjacent to your body.
Alcor does keep some of your stuff in a secret location, but given the problems of retrieving data from old media, it might be good if they offered an explicit service to store your data. I’d expect them to defer the storage itself to providers like Amazon, while handling the long-term problems of migrating to new providers as the need arises, and of decryption only on revival.
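To make that idea a little more concrete, here is a minimal, hypothetical sketch of the client-side half of such a service in Python, using the cryptography package. The filenames, the key-escrow arrangement, and the notion of keeping the decryption key with the cryonics organization while the ciphertext sits with a commercial provider are all assumptions for illustration, not anything Alcor actually offers.

```python
# Hypothetical sketch: encrypt a personal archive locally so the storage
# provider only ever holds ciphertext; the decryption key would be escrowed
# separately (e.g., with the cryonics organization) for release on revival.
# Requires: pip install cryptography
from pathlib import Path
from cryptography.fernet import Fernet

def encrypt_archive(archive_path: str, key_path: str, out_path: str) -> None:
    """Generate a key, encrypt the archive with it, and write both to disk."""
    key = Fernet.generate_key()             # symmetric key, to be escrowed
    ciphertext = Fernet(key).encrypt(Path(archive_path).read_bytes())
    Path(key_path).write_bytes(key)         # goes to the escrow holder
    Path(out_path).write_bytes(ciphertext)  # goes to the storage provider

def decrypt_archive(ciphertext_path: str, key_path: str, out_path: str) -> None:
    """On revival: recombine the escrowed key with the stored ciphertext."""
    key = Path(key_path).read_bytes()
    plaintext = Fernet(key).decrypt(Path(ciphertext_path).read_bytes())
    Path(out_path).write_bytes(plaintext)

if __name__ == "__main__":
    # Hypothetical filenames for illustration only.
    encrypt_archive("memoirs.tar.gz", "escrow.key", "memoirs.tar.gz.enc")
    # Migrating to a new provider later means copying the ciphertext only;
    # the key never has to move or be exposed until revival.
    decrypt_archive("memoirs.tar.gz.enc", "escrow.key", "memoirs.tar.gz")
```

The point of the split is that the long-term provider can be swapped out freely, since it only ever sees opaque bytes, while whoever holds the key decides when decryption happens.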
I would take the “I’m your great great grandchild” solution in a heartbeat—but I do not already have children, and something could still come up to prevent me from having them (and hence great great grandchildren).
If you’d take that solution, why not a great great … great grand niece? Or distant cousin? Any human child of that time will be related to you at some remove.
My sister doesn’t have children yet either, and may or may not in the future. It does matter if they’re a relation I’d ever be disposed to see at Christmas, which has historically bottomed out with second cousins.
Then it looks like I misunderstood. Say you have a child, then get preserved (though no one else you know does). Then say you wake up, it’s 500 years in the future, and you meet your great (great … great) great grandchild, someone you would never have seen at Christmas otherwise. Would this satisfy you?
If so, then you don’t have to worry. You will have relatives alive when you’re revived. Even if they’re descendants of cousins or second cousins. And since it will be 500 years in the future, you are equally likely to see your cousin’s 2510 descendant and your 2510 descendant at Christmas (that is, not at all).
If I had a child, I’d sign up me and said child simultaneously—problem solved right there. There’s no need to postulate any additional descendants to fix my dilemma.
I can’t get enthusiastic about second cousins 30 times removed. I wouldn’t expect to have even as much in common with them as I have in common with my second cousins now (with whom I can at least swap reminiscences about prior Christmases and various relatives when the situation calls for it).
You can’t guarantee that your child will go through with it, even if you sign em up.
Then why can you get enthusiastic about a great great grandchild born after you get frozen?
I can’t guarantee it, no, but I can be reasonably sure—someone signed up from birth (with a parent) would not have the usual defense mechanisms blocking the idea.
Direct descent seems special to me.
I find this thread fascinating.
I can usually think about something enough and change my feelings about it through reason.
For example, if I thought “direct descent seems special”, I could think about all the different ideas like the questions Blueberry asks and change my actual emotions about the subject.
I suspect this comes from my guilty pleasure... I take glee in biting the bullet.
Is this not the case with you?
I do not have a reliable ability to change my emotional reactions to things in a practically useful time frame.