>then it does little good to post hypotheses such as “they are too scared of false hope”
What would you recommend? I spend time talking to such people, but I’m just one person and I wish there were more theory about this. I’m surprised LW isn’t interested.
>Maybe some are, but certainly not all.
What do you view as the main factors?
>many of these cannot be reliably averted by a group of well-meaning people
Which ones are you thinking of? Some that I’m worried about:
-- forceful intervention (e.g. by a state or invader) that prevents people from putting liquid nitrogen (LN) in the dewars
-- collapse of civilization, such that LN becomes too expensive to produce and even people who care a lot can't rig something up (have you looked into this? do you know under what circumstances this might happen?)
-- X-risk, e.g. AGI risk, killing everyone (in particular, preventing humanity from developing nanotech etc.)
> investigation of cryonic technology, the social context in which it occurs, and expectations for the next 50 years
Which seem like the key points of failure? You said there are lots of points of failure, but that doesn't give an estimate; if we expand the conjunctions, we also have to expand the disjunctions. For example, I could be worried that I might die in a way that makes my cryopreservation worse: I'm left dead for a few days, there's physical trauma to my brain, etc. That does decrease the probability of success, but we also have to factor in uncertainty about how good preservation needs to be. The success probability isn't just P(perfect preservation) × P(humanity makes it to nanotech); we additionally have to add P(bad preservation) × P(nanotech) × P(souls can be well reconstructed from bad preservations).
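To make the disjunction explicit, the rough decomposition I have in mind looks like this (my notation, not yours; it also assumes preservation quality is independent of whether humanity reaches nanotech, which is itself debatable):

$$
\begin{aligned}
P(\text{success}) \approx{} & P(\text{good preservation}) \cdot P(\text{nanotech}) \cdot P(\text{reconstruction} \mid \text{good}) \\
& + P(\text{bad preservation}) \cdot P(\text{nanotech}) \cdot P(\text{reconstruction} \mid \text{bad})
\end{aligned}
$$

Each added disjunct raises the estimate, just as each added conjunct lowers it, so a long list of failure points doesn't by itself tell us the total probability is low.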