Do people buy cryonics because they really expect to be revived, or because it is a way to signal a set of beliefs, values, and allegiance to a specific community, akin to a religious burial ritual? My impression is that for most cryonicists, at least the ones on LessWrong, the latter motivation is dominant. This is consistent with current cryonics practices not being optimized to maximize revival chances.
I agree that the fact that current cryonics practice is not optimised for revival is extremely strong Bayesian evidence (for me at least) that most cryonicists on these forums are considerably more likely to be signalling than rationally trying to live forever. I would add to that the well-known problem of ‘cryo-crastinating’, which is hard to explain if pro-cryo individuals are highly rational life-year maximisers, but extremely easy to explain if people are willing to send a ‘pro-cryo’ signal when it is free, but not when it is expensive.
On the other hand, I am convinced that at least some cryonics advocates will find a discussion of cryonics strategy genuinely useful and important, and since I was reading about cryonics anyway, I thought I would fill what I considered to be a gap in the debate landscape.
Ok. My comment wasn’t intended as a criticism of your post; it was just my two cents on cryonics.
Current cryonics organizations may not be optimized in a Bayesian sense; but I don’t have much of a way to nudge my cryonics organization of choice into more rational behaviour, let alone any other cryo groups. One possibility is that cryo groups are dominated by traditional, non-Bayesian rationalists, who really are trying to live forever, but just aren’t applying the familiar-to-us techniques from the Sequences to accomplish that goal. :)
“Bayesian rationalists”, “the Sequences”... that is the kind of signalling I was talking about.

I’m not as smart as I wish I were; I just couldn’t think of a better way to refer to that general area of idea-space, especially given that the local audience consists primarily of LW-readers who likely already know the terms. If choosing a vocabulary for a target audience counts as ‘signalling’, is there any communication that /doesn’t/?
Anyway, I’m trying to live forever or die trying; and if a truck hits me tomorrow, there’s not much more I can do to prepare other than have already signed up for cryo. Nudging my chosen cryo group to improve my odds is more of a long-term thing, and I’m just getting started. Heck, I don’t even qualify to be elected to the board until I’ve been a member for another year or two.
I’m not as smart as I wish I were; I just couldn’t think of a better way to refer to that general area of idea-space, especially given that the local audience consists primarily of LW-readers who likely already know the terms.
Well, I’m also not as smart as I wish I were, and I don’t know what you mean by “Bayesian rationalist” or “optimized in a Bayesian sense”, but they do look like combinations of local buzzwords. If you were trying to use them to make an argument, then I didn’t get it.
if a truck hits me tomorrow, there’s not much more I can do to prepare other than have already signed up for cryo.
If a truck hits you tomorrow and you die, then you will most likely suffer massive brain damage before any cryopreservation procedure can be attempted, so I wouldn’t worry about it if I were you.