Cryonics is the practice of preserving people in liquid nitrogen soon after their hearts stop. The idea is that most of the brain’s information content is still intact immediately after “death” in the medical or legal sense. If humans invent molecular nanotechnology or brain emulation techniques, it may be possible to reconstruct the consciousness of cryopreserved patients.
Related: Life Extension, a more general tag about ways to avoid death.
Cryonics-associated issues commonly raised on LessWrong
Pro-cryonics points
Advanced reductionism/physicalism (because of the issues associated with identifying a person with the continuity of their brain’s information).
Whether an extended healthy lifespan is worthwhile (relates to Fun Theory, religious rationalizations for 70-year lifespans, “sour grapes” rationalizations for why death is actually a good thing).
The “shut up and multiply” aspect of spending roughly $300/year (Eliezer Yudkowsky quotes his costs as Cryonics Institute membership ($125/year) plus term life insurance ($180/year)) for a probability (its size widely disputed) of obtaining many more years of lifespan. For this reason, cryonics advocates regard failure to sign up as an extreme case of failure at rationality: low-hanging fruit by which millions of deaths per year could be prevented at low cost. A toy version of the arithmetic appears below.
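To make the “shut up and multiply” framing concrete, here is a minimal back-of-the-envelope sketch in Python. Everything in it except the roughly $300/year cost quoted above is an illustrative assumption (the years of paying dues, the probability of revival, the years gained), not a figure from any of the posts referenced on this page.

```python
# Toy "shut up and multiply" calculation for signing up for cryonics.
# Every input except the ~$300/year cost is a made-up illustrative assumption.

annual_cost = 300      # USD/year: membership plus term life insurance (quoted above)
years_paying = 50      # hypothetical years of paying dues before death
p_revival = 0.05       # hypothetical chance revival works (cf. Hanson's 5% below)
years_gained = 1000    # hypothetical extra years of life if revived

total_cost = annual_cost * years_paying      # $15,000 over a lifetime
expected_years = p_revival * years_gained    # 50 expected extra years
print(f"cost per expected extra year of life: ${total_cost / expected_years:,.0f}")
```

Under these made-up inputs the cost comes to about $300 per expected extra year of life; the point of the argument is that this explicit comparison should be made, whatever numbers one actually believes.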
Anti-cryonics points
Cognitive biases contributing to emotional prejudice in favor of cryonics (optimistic bias, motivated cognition).
The multiply-chained nature of the probabilities involved in cryonics, and whether the final expected utility is worth the cost (a sketch of such a chain follows this list).
Money spent on cryonics could, arguably, be better spent on efficient charity.
S-risks/hyperexistential risks: the far future may turn out to be dystopian and have negative expected value.
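The “multiply chained” structure of these probabilities can be sketched the same way. The stages and numbers below are hypothetical placeholders; Break Cryonics Down, linked in the next section, attempts a real decomposition.

```python
# Sketch of chaining conditional probabilities for cryonics working.
# Every stage and every number here is a hypothetical placeholder.

stages = {
    "preserved soon enough after legal death": 0.8,
    "brain information survives preservation": 0.5,
    "provider stores the patient long enough": 0.6,
    "revival technology is ever developed": 0.5,
    "future society chooses to revive you": 0.7,
}

p = 1.0
for stage, p_stage in stages.items():
    p *= p_stage
    print(f"{stage}: {p_stage:.2f} -> running product {p:.3f}")

print(f"overall probability under these assumptions: {p:.3f}")  # ~0.084
```

Even with each stage more likely than not, the product is small; the dispute is over whether the resulting expected utility still exceeds the cost.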
Notable Posts
We Agree: Get Froze by Robin Hanson. “My co-blogger Eliezer and I may disagree on AI fooms, but we agree on something quite contrarian and, we think, huge: More likely than not, most folks who die today didn’t have to die! … It seems far more people read this blog daily than have ever signed up for cryonics. While it is hard to justify most medical procedures using standard health economics calculations, such calculations say that at today’s prices cryonics seems a good deal even if you think there’s only a 5% chance it’ll work.”
You Only Live Twice by Eliezer Yudkowsky. “My co-blogger Robin and I may disagree on how fast an AI can improve itself, but we agree on an issue that seems much simpler to us than that: At the point where the current legal and medical system gives up on a patient, they aren’t really dead.”
The Pascal’s Wager Fallacy Fallacy—the fallacy of Pascal’s Wager combines a high payoff with a privileged hypothesis, one with low prior probability and no particular reason to believe it. Perceptually seeing an instance of “Pascal’s Wager” just from the high payoff, even when the probability is not small, is the Pascal’s Wager Fallacy Fallacy.
Normal Cryonics—On the shift of perspective that came from attending a gathering of normal-seeming young cryonicists.
That Magical Click—What is the unexplained process whereby some people get cryonics, or other frequently-derailed chains of thought, in a very short time?
Quantum Mechanics and Personal Identity by Eliezer Yudkowsky. A shortened index into the Quantum Physics Sequence describing only the prerequisite knowledge to understand the statement that “science can rule out a notion of personal identity that depends on your being composed of the same atoms—because modern physics has taken the concept of ‘same atom’ and thrown it out the window. There are no little billiard balls with individual identities. It’s experimentally ruled out.” The key post in this sequence is Timeless Identity, in which “Having used physics to completely trash all naive theories of identity, we reassemble a conception of persons and experiences from what is left” but this finale might make little sense without the prior discussion.
Break Cryonics Down by Robin Hanson—tries to identify some of the chained probabilities involved in cryonics.
Third Alternatives for Afterlife-ism by Eliezer Yudkowsky—explains why cryonics is a third option in the dilemma about whether we should tell noble lies about an afterlife, to prevent people from getting depressed by not believing in an afterlife.
A survey of anti-cryonics writing by ciphergoth—an attempt to find quality criticism of cryonics, with a surprising result that “there is not one person who has ever taken the time to read and understand cryonics claims in any detail, still considers it pseudoscience, and has written a paper, article or even a blog post to rebut anything that cryonics advocates actually say”.
From the old discussion page:
Talk:Cryonics
Since this is not Wikipedia, we should go further than just being encyclopedic, and give readers specific advice about the particulars of cryonics such as availability, cost, and lifestyle tips to maximize the probability of being frozen successfully. CannibalSmith 21:57, 8 November 2009 (UTC)
The size of the article should be within bounds. What do you mean by not being encyclopedic? As much as possible, the claims on the wiki should be well-understood and supported by discussion on the blog. --Vladimir Nesov 22:28, 8 November 2009 (UTC)
Consent to Revivers
I haven’t yet heard anything about this particular concern of mine, but I think there’s a cheap way to raise the probability of survival through cryonics by a significant margin.
This is a question of paperwork and consent. Many of the preserved hope for reanimation by a friendly superintelligent AI that has taken over the world. However, many people want to limit the power of such proposed beings by forcing them to slow down their operations and negotiate with humans. Thus, when you sign up, you should give written and recorded spoken consent, and orders to any and all entities to revive you by any means necessary, be it brain uploading or rebuilding of the body. You should also identify yourself as a human and a subject of the entity’s utility function, no matter how long it has been since you last thought something. I hereby consent, in full control of my mind and under the influence of no drug, to any procedure by any entity which lets me continue thinking. I also declare myself a person and a subject of their utility function, no matter what anybody has declared about my status. I do not identify myself as dead, now or in the future. Theonebutcher (talk) 08:01, 10 October 2013 (EST)