I’ve been thinking about this a lot for similar reasons, and one thing that concerns me is the market for human whole-brain emulations. Many of the examples in this thread are horror stories that lack a clear pathway to realization. (Yes, that would be bad, but how would it actually come about?) This one, however, strikes me as a very plausible pathway to some very bad futures for cryonics patients. There has always been an active market for humans (and I do mean as property, not for human services). Banning slavery was moral progress, but slavery persists in the modern guise of human trafficking. Why should we not expect the same phenomenon in a future with WBEs?

The ideal solution, in my eyes, is a Friendly AI that enforces ethical regulations on WBEs as quasi-natural law, though even that raises ethical questions about personal freedom and privacy. I wouldn’t claim that WBE is more likely than AI to arrive first, but we are more certain about WBE timelines than about superintelligent-AI timelines, and our desideratum here is how plausibly a given emerging technology leads to bad futures for cryonics patients. There are clear brute-force pathways to WBE: increase scanning resolution and make the primitives of your model as fundamental as possible, so that you can trade computation for understanding.

If there is no singleton to regulate WBEs by the time they are invented, then all it takes is one person with access to an emulation and the means to copy it. It’s conceivable that, for many purposes, emulations would be quite fungible, so access to even a single one could serve most of the demand. Consider also that a state of affairs like this lasting only a few objective minutes, for merely one emulation, would amount to several subjective millennia of suffering.
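To put a rough number on that time dilation, here is a back-of-the-envelope sketch. The 10^9 speedup factor is purely an illustrative assumption on my part (nobody knows how fast emulations will actually run), not a forecast:

```python
# Back-of-the-envelope: subjective time experienced by a sped-up emulation
# during a given span of objective (wall-clock) time.

SECONDS_PER_YEAR = 365.25 * 24 * 3600  # Julian year, in seconds

def subjective_years(objective_minutes: float, speedup: float) -> float:
    """Years experienced by an emulation running `speedup` times faster
    than a biological brain, over `objective_minutes` of wall-clock time."""
    return objective_minutes * 60 * speedup / SECONDS_PER_YEAR

# With a purely illustrative speedup of 10^9, five objective minutes
# become roughly 9,500 subjective years -- several millennia.
print(f"{subjective_years(objective_minutes=5, speedup=1e9):,.0f} years")
```

Even if the true speedup is several orders of magnitude lower, the conclusion is qualitatively the same: brief objective lapses in protection could translate into vast subjective spans of suffering.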
So it seems that, if this argument is valid, the order in which emerging technologies arrive is yet another factor that heavily affects whether one should become a cryonics patient.

Given my picture of how this might happen, it could actually be extremely helpful to specify that you be revived only under very particular circumstances; the best example off the top of my head is to elect to be revived only if it is possible to regenerate your physical body. Not because this avoids digitization entirely (the nanobots have to know how to rebuild you!), but because the way to avoid becoming a victim of the demand for WBEs is not to become one until inviolable regulations are in place protecting you from being surreptitiously copied.