I think it does imply subjective immortality. I’ll bite that bullet. Therefore, you should sign up for cryonics.
Consciousness isn’t continuous. There can be interruptions, like falling asleep or undergoing anesthesia. A successor mind/pattern is a conscious pattern that remembers being you. In the multiverse, any given mind has many, many successors. A successor doesn’t have to follow you immediately, or even follow you at all in time. At the separations implied even for a Tegmark Level I multiverse, past and future are meaningless distinctions, since no interaction is possible.
You are your mind/pattern, not your body. A mind/pattern is independent of substrate. Your unconscious, sleeping self is not your successor mind/pattern. It’s an unconscious object that has a high probability of creating your successor (i.e. it can wake up). The same goes for your cryonically preserved corpsicle, though the probability is lower.
Any near-death event will cause grievous suffering to any barely-surviving successors, and grief and loss to friends and relatives in branches where you (objectively) don’t survive. I don’t want to suffer grievous injury, because that would hurt. I also don’t want my friends and relatives to suffer my loss. Thus, I’m reluctant to risk anything that may cause objective death.
But, the universe being a dangerous place, I can’t make that risk zero. By signing up for cryonics, I can increase the measure of successors that have a good life, even after barely surviving.
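To make the structure of this argument concrete, here is a toy back-of-the-envelope sketch in Python. Every number in it is invented for illustration (the successor classes, measures, and utilities are my assumptions, not anything measured); only the shape of the comparison matters.

```python
# Toy expected-utility comparison for the cryonics argument above.
# All measures and utilities below are made-up placeholders.

# Hypothetical measure (weight) of successor classes after an
# objectively fatal event, without and with cryonics.
without_cryonics = {
    "barely_surviving": 0.001,   # injured successors, low quality of life
    "boltzmann_brain": 0.999,    # random qualia, no control
}
with_cryonics = {
    "revived": 0.010,            # preservation worked: a good life
    "barely_surviving": 0.001,
    "boltzmann_brain": 0.989,
}

# Hypothetical utility per unit of successor measure.
utility = {
    "revived": 1.0,
    "barely_surviving": 0.2,
    "boltzmann_brain": 0.0,
}

def expected_utility(successors):
    """Sum of (measure x utility) over the successor classes."""
    return sum(measure * utility[kind] for kind, measure in successors.items())

print(expected_utility(without_cryonics))  # ~0.0002
print(expected_utility(with_cryonics))     # ~0.0102
```

However the placeholder numbers are chosen, the point survives so long as revival has nonzero probability and revived successors are better off than Boltzmann-brain ones: cryonics shifts measure from the worst class of successors toward the best.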
In the multiverse, death isn’t all-or-nothing, black or white. A successor is a mind that remembers being you. It does not have to remember everything. If you take a drug that prevents you from forming long-term memories of today’s events, have you died by the next day? Objectively, no. Your friends and relatives can still talk to “you” the next day. Subjectively, partially. Your successors lack certain memories. But people forget things all the time.
Being mortal in the multiverse, you can expect your measure of successors to keep diminishing as your branches die, but the measure never reaches absolute zero. Eventually all that remains are Boltzmann brains and the like. The most probable Boltzmann brain successors only live long enough for a single conscious moment of remembering being you. The briefest of conscious thoughts. Their successors remember that thought and may have another random thought. You can eventually expect an eternity of totally random qualia and no control at all over your experience.
This isn’t Hell, but Limbo. Suffering is probably only a small corner of possible qualia-space, but so is eudaimonia. After an eternity you might stumble onto a small Boltzmann world where you have some measure of control over your utility for a brief time, but that world will die, and your successors will again be only Boltzmann brains.
I can’t help that some of my successors from any given moment are Boltzmann brains. But I don’t want my only successors to be Boltzmann brains, because they don’t increase my utility. Therefore, cryonics.
See the Measure Problem of cosmology. I’m not certain of my answer, and I’d prefer not to bet my life on it, but it seems more likely than not. I do not believe that Boltzmann brains can be eliminated from cosmology, only that they have lesser measure than evolved beings like us. This is because of the Trivial Theorem of Arithmetic: almost all natural numbers are really damn huge. The universe doesn’t have to be infinite to get a Tegmark Level I multiverse. It just has to be sufficiently large.
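The “almost all natural numbers are really damn huge” point can be made precise with natural density; here is a minimal sketch of the standard formalization (my gloss, not part of the original comment):

```latex
% For any fixed bound B, the set of naturals at or below B has
% natural density zero, so almost every natural number exceeds
% any bound you care to name:
\[
  d\bigl(\{\, n \in \mathbb{N} : n \le B \,\}\bigr)
    = \lim_{N \to \infty} \frac{\#\{\, n \le N : n \le B \,\}}{N}
    = \lim_{N \to \infty} \frac{B}{N}
    = 0.
\]
```

On this reading, “sufficiently large” does the same work as “infinite” for the Level I argument: the number of causally disconnected Hubble-volume configurations just has to exceed any bound that matters.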
Are people close to you aware that this is a reason that you advocate cryonics?
I’m not sure what you’re implying. Most people close to me are not even aware that I advocate cryonics. I expect this will change once I get my finances sorted out enough to actually sign up for cryonics myself, but for most people, cryonics alone already flunks the Absurdity heuristic. Likewise with many of the perfectly rational ideas here on LW, including the logical implications of quantum mechanics and cosmology, like Subjective Immortality. Linking more “absurdities” seems unlikely to help my case in most instances. One step at a time.
Actually, I’m just interested. I’ve been wondering whether big world immortality is a subject that would make people a) think that the speaker is nuts, b) freak out and possibly go nuts, or c) go nuts because they think the speaker is crazy; and whether or not it’s a bad idea to bring it up.