Well, no. You can’t cool a body to liquid helium temperatures without massive damage (the vitrification trick doesn’t work as well there). Liquid helium is also much harder to get and work with than liquid nitrogen; helium in general is much rarer. And radiation shielding will only help with background radiation. It won’t help much with radiation from C-14 decay or potassium decay, since both are naturally in your body and there’s not much you can do about them.
I don’t know as much about C-14, but wouldn’t potassium decay’s effects be small on timescales of ~10,000 years? The radioactive natural isotope K-40 has a ridiculously long half-life (1.25 billion years, which is why potassium-argon dating is popular for dating really old things) and makes up only 0.012% of natural potassium. Potassium is also much less abundant in the body than carbon: only about 140g of a 70kg person is potassium, although admittedly it might be more concentrated in the brain, which is the important part.
ETA: I did the calculations, and maybe there is a problem. Suppose 0.012% of K is K-40 by mass. Then I get 0.0168 grams of K-40 in a body, which comes out to 0.00042 moles, or 2.53e20 K-40 atoms. With a 1.25-billion-year half-life that makes 1.40e15 decays after 10,000 years. In absolute terms that’s a lot of emitted electrons and positrons. I don’t know whether the absolute number (huge) or the relative number (minuscule) is more important, though.
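(For anyone who wants to check the arithmetic, here’s a minimal Python sketch of the same estimate. The 140g potassium figure and 0.012% K-40 fraction are the ones assumed above; the ~40 g/mol molar mass is an approximation I’m plugging in:)

```python
AVOGADRO = 6.022e23        # atoms per mole
K_MASS_G = 140.0           # total potassium in a 70 kg body, from above
K40_FRACTION = 0.00012     # 0.012% of natural K assumed to be K-40, by mass
K40_MOLAR_MASS_G = 40.0    # g/mol, approximate
HALF_LIFE_YEARS = 1.25e9   # K-40 half-life
T_YEARS = 10_000           # storage time considered here

k40_grams = K_MASS_G * K40_FRACTION                    # ~0.0168 g
k40_atoms = k40_grams / K40_MOLAR_MASS_G * AVOGADRO    # ~2.53e20 atoms

# Number decayed after t years: N0 * (1 - 2^(-t / half_life))
decays = k40_atoms * (1 - 2 ** (-T_YEARS / HALF_LIFE_YEARS))
print(f"{k40_grams:.4f} g K-40, {k40_atoms:.2e} atoms, {decays:.2e} decays in {T_YEARS:,} years")
# -> 0.0168 g K-40, 2.53e+20 atoms, 1.40e+15 decays in 10,000 years
```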
I don’t have enough background to estimate how serious the decay would be. But 1.40e15 decays over 10,000 years works out to around 4,400 decay events a second (and since 10,000 years is negligible next to the 1.25-billion-year half-life, that rate stays essentially constant over the whole period). It seems that part of the issue is also that there’s no repair mechanism. When something is living it can take a fair bit of radiation with minimal negative effects; in some circumstances living creatures can even benefit from low levels of radiation. But radiation is going to be much more damaging to cells when they can’t engage in any repairs.
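(As a rough sketch of that per-second rate, reusing the atom count from the estimate above: the activity is λN, with λ = ln 2 / t½:)

```python
import math

K40_ATOMS = 2.53e20                       # from the estimate above
SECONDS_PER_YEAR = 3.156e7
HALF_LIFE_S = 1.25e9 * SECONDS_PER_YEAR   # K-40 half-life in seconds

decay_constant = math.log(2) / HALF_LIFE_S    # lambda, per second
activity = decay_constant * K40_ATOMS         # decays per second (becquerels)
print(f"~{activity:.0f} decays per second")   # -> ~4445 decays per second
```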
Edit: Also note that potassium-40 is the largest source of the radiation dose people get from their own bodies, so if this is OK then we’re generally OK. It seems that radiation is not a major limiting factor on long-term cryonic storage.
It’s true that radiation is more damaging to cells when they can’t engage in repairs. But damage per se is nothing to worry about in this case. When, e.g., a gamma ray photon breaks a protein molecule, that molecule is rendered nonfunctional; enough such events will kill a cell. But in the context of cryonics, a broken molecule is as good as an intact one provided it’s still recognizable. Rendering it impossible to tell what the original molecule was would take far more thorough destruction.
From Wikipedia, “The worldwide average background dose for a human being is about 2.4 millisievert (mSv) per year.” Even a lethal prompt dose is a couple of thousand times this quantity, and you can take maybe 10 times the lethal dose and still be conscious for a little while. So that’s 20,000 years of background radiation verified to not even significantly damage, let alone erase, the information in the brain. I’d be surprised if the timescale to information-theoretic death by that mechanism was very much less than a billion years.
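(Spelling that reasoning out, with an assumed roughly-lethal prompt dose of ~5 Sv, which is my own plugged-in figure rather than anything from the quote:)

```python
BACKGROUND_SV_PER_YEAR = 2.4e-3   # worldwide average background dose (Wikipedia figure above)
LETHAL_PROMPT_SV = 5.0            # assumed: a prompt dose around 5 Sv is usually fatal

years_per_lethal_dose = LETHAL_PROMPT_SV / BACKGROUND_SV_PER_YEAR   # ~2,100 years
years_per_10x_lethal = 10 * years_per_lethal_dose                   # ~21,000 years
print(f"~{years_per_lethal_dose:,.0f} years of background = one lethal dose; "
      f"~{years_per_10x_lethal:,.0f} years = ten lethal doses")
# -> ~2,083 years of background = one lethal dose; ~20,833 years = ten lethal doses
```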
The lack of an automatic repair mechanism makes things hairier, but while frozen, the radiation damage will be localized to the cells that get hit by radiation. By the time you get the tech to revive people from cryonic freezing, you’ll most likely have the tech to fix/remove/replace the individual damaged cells before reviving someone. I think you’re right that radiation won’t be a big limiting factor, though it may be an annoying obstacle.
OK, not so trivial. The isotope-breakdown issue might be unsolvable (unless you have nanobots go in and scrub out the unstable isotopes?), but I would imagine the radiation from it to be quite a bit less than what you get from solar incidence. Liquid helium cooling doesn’t seem like it would cause information-theoretic damage, just additional cracking. Ice crystal formation is already taken care of at this point.
But liquid-helium-level preservation tech really does not seem likely to be needed, given how stable LN2 already gets you. The only reason you’d need it is if technological progress starts taking a really long time.
If you’ve got nanobots good enough to remove unstable isotopes, you almost certainly have the tech to do full-out repair. I don’t know if the radiation is less than what you get from solar incidence; I suspect it is, but I also suspect that in a typical underground environment much more of the radiation will come from one’s own body than from the sun.
Cracking can include information-theoretic damage if it mangles the interfaces at synapses badly enough. We don’t actually have a good enough understanding of how the brain stores information to make more than very rough estimates. And cracking is also a problem for those cryonics proponents who don’t self-identify with a computerized instantiation of their brain.