I agree that driving is more concrete, and thus slightly easier to find real numbers about.
The difference in likelihood between immortality-and-resurrection ASI vs. immortality-without-resurrection ASI seems to me to be smaller than the difference in likelihood between “ASI is possible” and “ASI as we imagine it is impossible for some reason we haven’t discovered yet”. (Taking “ASI as we imagine it” to mean a superintelligence that both can and wants to make us immortal, the “is impossible” might be as simple as it deciding that there’s some watertight ethical case against immortality which we just weren’t smart enough to figure out.)
I think that guesstimating an actual likelihood that an ASI which could offer immortality couldn’t offer resurrection is a worthwhile exercise in reasoning about the limits of the hypothetical ASI. That would in turn offer a structure for reasoning about the likelihood that an ASI might never exist, or that it might exist and decide that giving us eternal happiness or immortality or whatever is actually not a good idea.
Thank you for this. The idea of “if you die before the singularity but are signed up for cryonics you might be revived” didn’t really register with me until now. I feel silly. It’s a hugely important thing that I overlooked. It’s kinda shaken me. Currently, avoiding death is a pretty big thing for me, but given this, it may not be something worth prioritizing so much. Let me try to play with some numbers.
I suppose we’re just multiplying by the probability of immortality without resurrection. E.g., if I die right now, let’s ignore the ~50 years of pre-singularity life I lose and focus on losing the 10% chance of living 100k post-singularity years, i.e. an expectation of 10k years. But I only lose those 10k years if it’s immortality without resurrection. So what is the probability of immortality without resurrection? Suppose it’s 30%. Then the expected loss is 3k years instead of 10k.
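A minimal sketch of that arithmetic, where every number is just the illustrative placeholder from above rather than a real estimate:

```python
# Back-of-the-envelope sketch; all numbers are placeholder guesses from the comment.
p_post_singularity_life = 0.10    # guessed chance of getting the long post-singularity life at all
years_if_it_happens = 100_000     # guessed length of that life

expected_years = p_post_singularity_life * years_if_it_happens   # 10,000 expected years

# Dying now only forfeits those years if the future offers immortality *without* resurrection.
p_no_resurrection = 0.30          # placeholder guess
expected_loss_if_i_die_now = p_no_resurrection * expected_years  # 3,000 expected years

print(expected_years, expected_loss_if_i_die_now)
```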
Furthermore, if it’s immortality without resurrection, I think those life years are more likely to be unpleasant. I might not even want to be living them. Doesn’t immortality without resurrection indicate pretty strongly that the AI is unfriendly? In which case, it wouldn’t make sense to go to great lengths trying to avoid death, e.g. by not riding in cars.
On the other hand, when people die in car accidents, it seems like the type of thing where your brain could be damaged badly enough that you couldn’t be cryonically preserved. Hm, this feels pretty cruxy. There’s gotta be at least a 10% chance that the car accident that kills you would also prevent you from being cryonically frozen, right? If so, we’re only cutting the expected loss down by an order of magnitude. That seems like a lower bound. In reality, I’d think it’s more like a 50% chance.
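Extending the sketch above with that factor (again, the 10% and 50% figures are just the guesses in this comment, not real data):

```python
# Continues the placeholder numbers from the sketch above.
expected_years = 10_000              # the 10% chance of 100k post-singularity years

# With resurrection-via-cryonics on the table, a fatal crash only costs those years
# when it also damages the brain badly enough to rule out preservation.
p_crash_prevents_cryo_low = 0.10     # guessed lower bound
p_crash_prevents_cryo_guess = 0.50   # the "in reality" guess

print(p_crash_prevents_cryo_low * expected_years)    # 1,000 expected years lost
print(p_crash_prevents_cryo_guess * expected_years)  # 5,000 expected years lost
```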