Thank you for this. The idea of “if you die before the singularity but are signed up for cryonics you might be revived” didn’t really register with me until now. I feel silly. It’s a hugely important thing that I overlooked. It’s kinda shaken me. Currently, avoiding death is a pretty big thing for me, but given this, it may not be something worth prioritizing so much. Let me try to play with some numbers.
I suppose we’re just multiplying by the probability of immortality without resurrection. E.g., if I die right now, let’s ignore the ~50 years of pre-singularity life I lose and focus on losing the 10% chance of living 100k post-singularity years, i.e. an expectation of 10k years. But I only lose those 10k years if it’s immortality without resurrection. So what is the probability of immortality without resurrection? Suppose it’s 30%. Then the expectation is 3k years instead of 10k.
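To keep the arithmetic straight, here’s the same back-of-the-envelope calculation as a quick Python sketch. Every number in it is just the placeholder from the paragraph above, not an actual estimate:

```python
# Placeholder numbers from the paragraph above, not real estimates.
p_long_life = 0.10            # chance of living 100k post-singularity years
years_if_long_life = 100_000  # length of that post-singularity life
p_no_resurrection = 0.30      # chance the future is immortality-without-resurrection

expected_years = p_long_life * years_if_long_life       # 10,000 expected years
lost_by_dying_now = expected_years * p_no_resurrection  # 3,000 of them actually at stake
print(f"{expected_years:,.0f} expected years, {lost_by_dying_now:,.0f} lost by dying now")
```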
Furthermore, if it’s immortality without resurrection, I think those life years are more likely to be unpleasant; I might not even want to be living them. Doesn’t immortality without resurrection indicate pretty strongly that the AI is unfriendly? In which case, it wouldn’t make sense to go to great lengths trying to avoid death, e.g. by not riding in cars.
On the other hand, when people die in car accidents, it seems like the type of thing where your brain could be damaged badly enough that you couldn’t be cryonically frozen. Hm, this feels pretty cruxy. There’s gotta be at least a 10% chance that the car accident that kills you would also prevent you from being cryonically frozen, right? If so, we’re only cutting things down by an order of magnitude. That seems like a lower bound. In reality, I’d think it’s more like a 50% chance.
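Folding that in, still with made-up numbers (the 30% from before, plus a 10% to 50% chance that the fatal accident also rules out preservation), the sketch becomes:

```python
# Placeholder numbers, same caveat as before.
expected_years = 0.10 * 100_000   # 10,000 expected post-singularity years
p_no_resurrection = 0.30          # future where cryonics never pays off

for p_not_preservable in (0.10, 0.50):
    # I lose the years if there's no resurrection at all, OR if resurrection
    # happens but the accident left my brain unpreservable.
    p_lost = p_no_resurrection + (1 - p_no_resurrection) * p_not_preservable
    print(f"{p_not_preservable:.0%} preservation-failure risk -> "
          f"{expected_years * p_lost:,.0f} expected years lost")

# 10% -> ~3,700 years; 50% -> ~6,500 years. Even the 10% lower bound means
# cryonics cuts the expected cost of a car-accident death by at most ~10x
# relative to no cryonics at all (10,000 years), not to zero.
```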