Read this hypothetical objection:
Some optimistic future scenarios speculate that we might be able to revive even those who don’t cryopreserve (current cloning techniques on preserved remains can recreate your genetic phenotype; some sort of simulation run on records of your behaviour might be able to recreate your behavioural phenotype, and so on for every part that makes up you). That applies to the pessimistic future scenarios too: if you don’t sign up for cryo, you will still be taking the risk that the future is hellish.
It would be extremely surprising if our current or traditional death ceremonies were the optimal way to minimise that risk. Almost certainly, we should be trying to minimise it further. Cremation, destruction of records pertaining to ourselves, erasure of your Facebook profile, planting deliberately false information, and other such tactics should be considered.
Does this objection strike you as reasonable, or unreasonable?
If a copy of me were made, would this instance of me experience the next instance’s experiences? I don’t think so. As far as whether I could suffer from being re-created, I doubt that. However, I’d be very concerned about future instances of me being abused if I thought there were an interest in reviving me. If I were famous, I’d worry that fans might want to make a bunch of clones of me, and I’d worry about how those clones were treated. Unless I had reason to think that (a) people are going to reconstruct me against my will and (b) the people reconstructing me would do something unethical with the clones, I wouldn’t worry about it.
Why do you ask?
From the perspective that you are your instances, it matters because if you fear being abused, you would fear any instance of you being abused. You wouldn’t want to walk into an atomically precise copying machine with the knowledge that the copy is going to be used in cruel experiments.
The question becomes: where do you draw the line? Is a rough copy of you, based on your Facebook posts and whatever an advanced AI can extrapolate from them, just as worthy of you anticipating its experiences? Or perhaps you should fear ending up as that person on a relative scale, depending on how similar it is: if it is 50% similar, have 50% of the fear, and so on. Fear and other emotions don’t have to be a simple binary relationship, after all.
Empathy is an emotion that seems distinctly different (meaning it feels different, and probably happens differently on a biochemical and neurological level) from the emotion of actually anticipating being the individual. So while, yes, I would feel empathy for any abused clones, and that applies regardless of how much I fear waking up as them, it would not be the only emotion, because I believe I would be the clones. Any information indicating that clones might be abused in the future becomes much nearer to me, and I am more likely to act on it, if I think it likely that I will actually be one of them.
Thus, if you think the future is bad enough that you seriously would not want to wake up from cryonics, it might still be smart to be concerned for the safety of clones who could be you anyway. Since you haven’t stated a desire to be cloned, being cloned against your will is more likely to be carried out by unethical people than by ethical ones, so even if the prospect is fairly remote, it is more worrying than the prospect with cryonics, where caring people must keep you frozen and do have your consent to bring you back.
I fear a rough copy of myself made from my Facebook posts (and LessWrong comments) being tortured about as much as I fear an intricate 3D sculpture of me being made and then used as a target at a gun range. Is that really just me?
Nope, I’d feel the same. I think I would like to hang out with a rough copy of myself made from my internet behaviour, though.
Hmm. How do you feel about the prospect of an atomically precise copy of yourself being used as a living target at a gun range?
Is my corpse an atomically precise copy of myself? I wouldn’t care much about that.
If you mean the classic sci-fi picture of an exact and recent clone of myself, I would certainly prefer that a copy of myself be used at a gun range rather than a copy of my daughters or of a few of my relatives. And I would certainly prefer that a copy of myself be used rather than the single original of any of my relatives.
It is ironic that a rationalist discussion of values comes down to questions like “how do you feel about...”. Personally, much of my rational effort around values goes into making choices that go against some or even many of my feelings, presumably to get at values that I think are more important. I highly value not being fooled by appearances, and I highly value minimizing the extent to which I succumb to “cargo cult” reasoning. I’m not sure how much identifying myself with a copy of myself is valid (whatever that means in this context) and how much is cargo cult. But I’m pretty sure identifying myself with my corpse, or with a caricature of myself, is cargo cult.
If you suffer from dementia or some other neurodegenerative condition for a few years, it will turn you into a very different person. A “rough” copy made from information mined from the internet could perhaps be much closer to the healthy version of the person than the version kept alive in a nursing home in their later years is. Because of this, I don’t see how you can conclude that identifying with a “caricature” is cargo cult by definition.
Your corpse is definitely not an atomically precise copy of yourself. Corpses are subject to extensive structural damage, which makes their state of unconsciousness irreversible. If this were not the case, we would neither call them corpses nor consider it unreasonable to identify with them.
A more interesting grey area would be if you were subjected to cryonics or plastination, copied while in a completely ametabolic and unconscious state, and then reanimated. You could look across at a plastic-embedded or frozen copy of yourself and not even know whether it is the original. In fact, there could be many of them, implying that you are probably not the original unless you can obtain information to the contrary.
If you value your original self sufficiently, that seems to imply the following: if, say, you wake up in a room with 99 other versions of you still in stasis and have a choice to (a) destroy them all and live or (b) kill yourself and have them all reanimated, you should commit to suicide in advance, so that it becomes 99% likely that the instance making the choice is a copy who will pick that option, leaving the original to be reanimated.
On the other hand, if you don’t care whether you are the original or a copy, you can destroy all those nonsentient copies (with a 99% chance that one of them is the original) without worrying about it.
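To make the arithmetic behind those percentages concrete, here is a minimal sketch, assuming the toy setup above: 100 instances in total, the awake one equally likely to be any of them, and a policy fixed in advance. The function name and policy labels are hypothetical, purely for illustration.

```python
import random

# Toy model of the scenario above: 100 instances exist, 99 in stasis and one
# awake. The awake instance is equally likely to be any of the 100, so it is
# the original with probability 1/100.

def original_survives(policy, n_instances=100):
    awake_is_original = random.randrange(n_instances) == 0
    if policy == "destroy_others":
        # (a) the awake instance destroys the rest and lives.
        return awake_is_original
    if policy == "suicide_and_reanimate":
        # (b) the awake instance kills itself; the other 99 are reanimated.
        return not awake_is_original
    raise ValueError(policy)

trials = 100_000
for policy in ("destroy_others", "suicide_and_reanimate"):
    p = sum(original_survives(policy) for _ in range(trials)) / trials
    print(policy, round(p, 2))
# Prints roughly 0.01 for "destroy_others" and 0.99 for "suicide_and_reanimate":
# precommitting to (b) gives the original a 99% chance of surviving.
```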
I’ve had success explaining cryonics to people by using the “reconstruct” (succinct term, thank you!) spectrum: on one end, maybe reconstruction is easy, and we’ll all get to live forever; on the other end, maybe it’s impossible, and you simply cannot spend more than a few days de-animated before being lost forever. In the future, there will be scientists who do research and experiments and determine where on the spectrum the technology actually is. Cryonics is just a particular corpse-preservation method that prepares for reconstruction being difficult.
More succinctly, cryonics is trying to reach the future, and this hypothetical objection is trying to avoid the future.
I asked because it seemed that, if a fear of a bad future is a reason not to try harder to reach the future, it should also be a reason to try harder to avoid the future, and I was curious to examine this fear of the future.