And what do you do with the copy? Kill it?

I’m OK with the deletion of very-short-lived copies of myself if there are good reasons to do it. For example, suppose that after cryonic suspension I’m revived via scanning and whole-brain emulation (WBE). Unfortunately, unbeknownst to those reviving me, I have a phobia of the Michelin Man, and the picture of him on the wall means I deal with the shock of my revival very badly. I’d want the revival team to just shut down, change the picture on the wall, and try again.
I can also of course imagine lots of circumstances where deletion of copies would be much less morally justifiable.
I’m OK with the deletion of very-short-lived copies of myself if there are good reasons to do it.
There’s a very nice thought experiment that helps demonstrate this (I think it’s from Nozick). Imagine a sleeping pill that makes you fall asleep in thirty minutes, but you won’t remember the last fifteen minutes of being awake. From the point of view of your future self, the fifteen minutes you don’t remember is exactly like a short-lived copy that got deleted after fifteen minutes. It’s unlikely that anyone would claim taking the pill is unethical, or that you’re killing a version of yourself by doing so.
It’s unlikely that anyone would claim taking the pill is unethical, or that you’re killing a version of yourself by doing so.
Armchair reasoning: I can imagine the mental clone and the original existing at the same time, side by side. I cannot imagine myself with the memory loss and myself without the memory loss existing at the same time. Also, whatever actions my past self takes actually affect my future self regardless of what I remember. As such, my instinct is to think of the copy as a separate identity and my past self as the same identity.

Your copy would also take actions that affect your future self. What is the difference here?
Imagine a scenario where I cut off my arm. I am responsible. If my copy cuts off my arm, he would be responsible, not “me.”
This is all playing semantics with personal identity. I am not trying to espouse any particular belief; I am only offering one possible difference between the idea of forgetting your past and copying yourself.
That doesn’t make any sense. Your copy is you.

Yeah, okay. You are illustrating my point exactly. Not everyone thinks the way you do about identity, and not everyone thinks the way I mentioned about identity. I don’t hold to it hard and fast one way or the other.
But the original example, in which someone who loses 15 minutes is likened to killing off a copy who only lived for 15 minutes, implies a whole ton of things about identity. The word “copy” is too ambiguous to just say, “Your copy is you.”

If I switch in “X’s copy is X” and then start talking about various cultural examples of copying, we quickly run into trouble. Why does “X’s copy is X” work for people? Unless I missed a definition-of-terms comment or post somewhere, I don’t see how we can just assume that is true.
The first use of “copy” I found in this thread is:
Probably the “friendly” action would be to create an un-drunk copy of them, and ask the copy to decide.
It was followed by:
And what do you do with the copy? Kill it?
As best as I can tell, you take the sentence “Your copy is you” to be a tautology or definition or something along those lines. (I could obviously be wrong; please correct me if I am.) What would you call a functionally identical version of X with a separate, distinct identity? Is it even possible? If it is, use that instead of “copy” when reading my comment:
Imagine a scenario where I cut off my arm. I am responsible. If my copy cuts off my arm, he would be responsible, not “me.”
When I read the original comment I responded to:
From the point of view of your future self, the fifteen minutes you don’t remember is exactly like a short-lived copy that got deleted after fifteen minutes.
I was not assuming your definition of copy, which could entirely be my fault, but I find it hard to believe that you didn’t understand my point enough to predict this response. If you did, it would have been much faster to simply say, “When people at LessWrong talk about copies they mean blah,” in which case I would have responded, “Oh, okay, that makes sense. Ignore my comment.”
The semantics get easier if you think of both as being copies, so you have past-self, copy-1, and copy-2. Then you can ask which copy is you, or if they’re both you. (If past-self is drunk, copy-1 is drunk, and copy-2 is sober, which copy is really more “you”?)

Yeah, actually, that helps a lot. Using that language, most of the follow-up questions I have are obvious enough to skip bringing up. Thanks.
I’d actually be kinda hesitant about such pills and would need to think it out. The version of me that is in those 15 minutes might be a bit unhappy about the situation, for one thing.

Such pills do exist in the real world: a lot of sleeping pills have similar effects, as does consuming significant amounts of alcohol.
And it basically results in 15 minutes of experience that simply “go away”? No gradual transition or merging into the mainline experience, simply 15 minutes that get completely wiped?

Eeew.

For that matter, so does falling asleep in the normal way.
Certainly—this is the restore-from-backup scenario, for which Blueberry’s sleeping-pill comparison was apt. (I would definitely like to make a secure backup before taking a risk, personally.) What I wanted to suggest was that duplicate-for-analysis was less clear-cut.
What’s the difference? Supposing that as a matter of course the revival team try a whole bunch of different virtual environments looking for the best results, is that restore-from-backup or duplicate-for-analysis?
Suppose that we ironically find that the limitations on compute hardware mean that, no matter how much we spend, we hit an exact 1:1 ratio between subjective and real time, but that the hardware itself is super-cheap. Also, there’s no brain “merge” function. I might fork off a copy to watch a movie to review it for myself, to decide whether the “real me” should watch it.
As MrHen pointed out, you can imagine the ‘duplicate’ and ‘original’ existing side-by-side—this affects intuitions in a number of ways. To pump intuition for a moment, we consider identical twins to be different people due to the differences in their experiences, despite their being nearly identical on a macro level. I haven’t done the calculations to decide where the border of acceptable use of duplication lies, but deleting a copy which diverged from the original twenty years before clearly appears to be over the line.
Absolutely, which is why I specified short-lived above.

Though it’s very hard to know how I would face the prospect of being deleted and replaced with a twenty-minute-old backup in real life!
I may be answering an un-asked question, since I haven’t been following this conversation, but the following solution to the issue of clones occurs to me:
Leave it up to the clone.
Make suicide fully legal and easily available (possibly ‘suicide of any copy of a person in cases where more than one copy exists’, though that could allow twins greater leeway depending on how you define ‘person’ - perhaps also add a time limit: the split must have occurred within N years). When a clone is created, it’s automatically given the rights to 1⁄2 of the original’s wealth. If the clone suicides, the original ‘inherits’ the wealth back. If the clone decides not to suicide, it automatically keeps the wealth that it has the rights to.
Given that a clone is functionally the same person as the original, this should be an ethical solution (assuming that you consider suicide ethical at all) - someone would have to be very sure that they’d be able to go through with suicide, or very comfortable with the idea of splitting their wealth in half, in order to be willing to take the risk of creating a clone. The only problem that I see is with unsplittable things like careers and relationships. (Flip a coin? Let the other people involved decide?)
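A minimal sketch of the incentive structure this rule sets up, in Python; the names here (Person, create_clone, clone_suicides) and the reduction of “wealth” to a single number are illustrative assumptions, not part of the proposal:

```python
# Toy model of the proposed rule: on cloning, the clone automatically gets
# rights to half the original's wealth; only the clone's own later choice
# to suicide returns that half to the original.
from dataclasses import dataclass


@dataclass
class Person:
    name: str
    wealth: float


def create_clone(original: Person) -> Person:
    """The clone is created with rights to 1/2 of the original's wealth."""
    half = original.wealth / 2
    original.wealth -= half
    return Person(name=original.name + " (clone)", wealth=half)


def clone_suicides(original: Person, clone: Person) -> None:
    """If the clone chooses suicide, the original 'inherits' the wealth back."""
    original.wealth += clone.wealth
    clone.wealth = 0.0


alice = Person("Alice", 100_000.0)
alice_2 = create_clone(alice)
assert alice.wealth == alice_2.wealth == 50_000.0
clone_suicides(alice, alice_2)  # entirely the clone's decision, never enforced
assert alice.wealth == 100_000.0
```

The sketch just makes the trade-off concrete: the only way the original recovers the other half is through a decision the clone remains free not to make.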
This seems like a good solution. If I cloned myself, I’d want it to be established beforehand which copy would stay around, and which copy would go away. For instance, if you’re going to make a copy that goes to watch a movie to see if the movie is worth your time, the copy that watches the movie should go away, because if it’s good the surviving version of yourself will watch it anyway.
someone would have to be very sure that they’d be able to go through with suicide
I (and thus my clones) don’t see it as suicide, more like amnesia, so we’d have no problem going through with it if the benefit outweighed the amnesia.
If you keep the clone around, in terms of splitting their wealth, both clones can work and make money, so you should get about twice the income for less than twice the expenses (you could share some things). In terms of relationships, you could always bring the clones into a relationship. A four way relationship, made up of two copies of each original person, might be interesting.
A four way relationship, made up of two copies of each original person, might be interesting.
Hmm… *Imagines such a relationship with significant other.* Holy hell, that would be weird. The number of puzzling scenarios I can think of just by sitting here is extravagant. Does anyone know of a decent novel based on this premise?
I don’t think those kinds of situations will need to be spelled out in advance, actually. Coming up with a plan that’s acceptable to both versions of yourself before going through with the cloning should be about as easy as coming up with a plan that’s acceptable to just one version, once you’re using the right kind of framework to think about it. (You should be about equally willing to take either role, in other words; otherwise your clone is likely to rebel, and since they’re considered independent from the get-go (and not bound by any contracts they didn’t sign, I assume), there’s not much you can do about that.)
Setting up four-way relationships would definitely be interesting. Another scenario that I like is one where you make a clone to pursue an alternate life-path that you suspect might be better but think is too risky—after a year (or whatever), whichever of you is less happy could suicide and give their wealth to the other one, or both could decide that their respective paths are good and continue with half-wealth.
The more I think about this, the more I want to make a bunch of clones of myself. I don’t even see why I’d need to destroy them. I shouldn’t have to pay for them; they can get their own jobs, so wealth isn’t that much of a concern.
Coming up with a plan that’s acceptable to both versions of yourself before going through with the cloning should be about as easy as coming up with a plan that’s acceptable to just one version, once you’re using the right kind of framework to think about it.
The concern is that immediately after you clone, both copies agree that Copy 1 should live and Copy 2 should die, but afterwards, Copy 2 doesn’t want to lose those experiences. If you decide beforehand that you only want one of you around, and Copy 2 is created specifically to be destroyed, there should be a way to bind Copy 2 to suicide.

Disagree. I would class that as murder, not suicide, and consider creating a clone who would be subject to such binding to be unethical.
Calling it murder seems extreme, since you end up surviving. What’s the difference between binding a copy to suicide and binding yourself to take a sleep-amnesia pill?
If it’s not utterly voluntary when committed, I don’t class it as suicide. (I also consider ‘driving someone to suicide’ to actually be murder.)
My solution to the ethical dilemma, to reword it, is to give the clone full human rights from the moment it’s created (actually a slightly expanded version of current human rights, since we’re currently prohibited from suiciding). I assume that it’s not currently possible to enforce a contract that will directly cause one party’s death; that aspect of inter-human interaction should remain. The wealth-split serves as a balance in two ways: suddenly having your wealth halved would be traumatic for almost anyone, which gives a clone that had planned to suicide extra impetus to do so, and it should also strongly discourage people from taking unnecessary risks when making clones. In other words, that’s not a bug, it’s a feature.
The difference between what you proposed and the sleeping pill scenario is that in the latter, there’s never a situation where an individual is deprived of rights.
If it’s not utterly voluntary when committed, I don’t class it as suicide.
I’m still unclear why you classify it as death at all. You end up surviving it.
I think you’re thinking of each copy as an individual. I’m thinking of the copies collectively as a tool used by an individual.
The difference between what you proposed and the sleeping pill scenario is that in the latter, there’s never a situation where an individual is deprived of rights.
Ok, say you enter into a binding agreement forcing yourself to take a sleeping pill tomorrow. You have someone there to enforce it if necessary. The next day, you change your mind, and the person forces you to take the pill anyway. Have you been deprived of rights? (If it helps, substitute eating dessert, or gambling, or doing heroin for taking the pill.)

I don’t think any such agreement could be legally binding under current law, which is relevant since we’re talking about rights.
I think you’re thinking of each copy as an individual. I’m thinking of the copies collectively as a tool used by an individual.
Yes, I am, and as far as I can tell mine’s the accurate model. Each copy is separately alive and conscious; they should no more be treated as the same individual than twins are treated as the same individual. (Otherwise, why is there any ethical question at all?)
Ok, say you enter into a binding agreement forcing yourself to take a sleeping pill tomorrow. … Have you been deprived of rights?
This kind of question comes up every so often here, and I still haven’t heard or thought of an answer that satisfies me. I don’t see it as relevant here, though, because I do recognize the clone as a separate individual who shouldn’t be coerced.
Yes, I am, and as far as I can tell mine’s the accurate model.
But if my copies and I don’t think that way, is it still accurate for us? We agree to be bound by any original agreement, and we think any of us are still alive as long as one of us is, so there’s no death involved. Well, death of a living organism, but not death of a person.
I don’t see it as relevant here, though, because I do recognize the clone as a separate individual who shouldn’t be coerced.
It’s the same question, because I’m assuming both copy A and copy B agree to be bound by the agreement immediately after copying (which is the same as the original making a plan immediately before copying). Both copies share a past, so if you can be bound by your past agreements, so can each copy. Even if the copies are separate individuals, they don’t have separate pasts.
If you and all your copies think that way, then you shouldn’t have to worry about them defecting in the first place, and the rule is irrelevant for you. How sure are you that that’s what you really believe, though? Sure enough to bet 1⁄2 your wealth?
My concern with having specific copies be bound to past agreements is that I don’t trust that people won’t abuse that: It’s easy not to see the clone as ‘yourself’, but as an easily exploitable other. Here’s a possible solution to that problem (though one that I don’t like as well as not having the clone bound by prior agreements at all): Clones can only be bound by prior agreements that randomly determine which one acts as the ‘new’ clone and which acts as the ‘old’ clone. So, if you split off a clone to go review a movie for you, and pre-bind the clone to die after reporting back, there’s a 50% chance—determined by a coin flip—that it’s you, the original, who will review the movie, and the clone who will continue with your life.
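A toy sketch of that coin-flip constraint, again in Python; treating the roles as plain strings and the helper name assign_roles are illustrative assumptions only:

```python
# The binding agreement may only specify *roles*; which copy fills the
# pre-agreed role is decided by a fair coin at the moment of the split.
import random


def assign_roles(copy_a: str, copy_b: str, bound_role: str) -> dict:
    """Randomly decide which copy is bound to the pre-agreed role."""
    if random.random() < 0.5:  # the 50% chance from the proposal above
        return {copy_a: bound_role, copy_b: "continues the original life"}
    return {copy_a: "continues the original life", copy_b: bound_role}


print(assign_roles("Copy A", "Copy B", bound_role="reviews the movie, then reports back and dies"))
```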
There isn’t an “original”. After the copying, there’s Copy A and Copy B. Both are me. I’m fine with randomly selecting whether Copy A or Copy B goes to see the movie, but it doesn’t matter, since they’re identical (until one sees the movie). In fact, there is no way to not randomly select which copy sees the movie.
From the point of view of the clone who sees the movie (say it’s bad), “suiciding” is the same as him going back in time and not seeing the movie. So I’d always stick to a prior agreement in a case like that.
If you and all your copies think that way, then you shouldn’t have to worry about them defecting in the first place, and the rule is irrelevant for you. How sure are you that that’s what you really believe, though? Sure enough to bet 1⁄2 your wealth?
I don’t really have any wealth to speak of. But they’re all me. If I won’t defect, then they won’t. The question is just whether or not we might disagree on what’s best for me. In which case, we can either go by prior agreement, or just let them all live. If the other mes really wanted to live, I’d let them. For instance, say I made 5 copies and all 5 of us went out to try different approaches to a career, agreeing the best one would survive. If a year later more than one claimed to have the best result for Blueberry, I might as well let more than one live.
ETA: However, there might be situations where I can only have one copy survive. For instance, I’m in a grad program now that I’d like to finish, and more than one of me can’t be enrolled for administrative reasons. So if I really need only one of me, I guess we could decide randomly which one would survive. I’m all right with forcing a copy to suicide if he changes his mind, since I’m making that decision for all the clones ahead of time to lead to the best outcome for Blueberry.
Response to ETA:

If one of the clones developed enough individuality to change his mind and disagree with the others, I definitely don’t see how you could consider that one anything other than an individual.
Likewise, if all of the clones decided to change their minds and go their separate ways, that would be functionally the same as you-as-a-single-person-with-a-single-body changing your mind about something, and the general rule there is that humans are allowed to do that, without being interfered with. I don’t see any reason to change that rule.
Be careful of generalizing from one example. I’m relatively certain that the vast majority of people who might consider cloning themselves wouldn’t see it the way you do, and would in fact need significant safeguards to protect the version of themselves who remembers waking up in a lab from being abused by the version of themselves who remembers going home after having their DNA sampled and their brain scanned.
I did have people like you in mind, at least peripherally, in my original suggestion, though: I’m fairly sure that the original proposal doesn’t take away any rights that you already have. (To the best of my knowledge, it is illegal for someone to force you to take a sleeping pill, even if you previously agreed to it, and my knowledge there is a bit better than average; remember that I worked at a nursing home.)
I’m relatively certain that the vast majority of people who might consider cloning themselves wouldn’t see it the way you do, and would in fact need significant safeguards to protect the version of themselves who remembers waking up in a lab from being abused by the version of themselves who remembers going home after having their DNA sampled and their brain scanned.
I’d like to hear more about this. First, I was imagining an identical atom-for-atom duplicate being constructed, in such a way that there is no fact of the matter who’s the original. As in, you press a button and there are two of you. I wasn’t thinking about an organism grown in a lab. But I’m not sure that matters, except that the lab scenario makes it easier to think of one copy being in control of the other copy.
You think the majority of people would worry about, and would need to worry about, one copy abusing the other copy? Why? The copies would have to fight for control first, which should be an even fight. And what would the point be?
I’m fairly sure that the original proposal doesn’t take away any rights that you already have. To the best of my knowledge, it is illegal for someone to force you to take a sleeping pill, even if you previously agreed to it.
Yes, that’s illegal except maybe in an emergency psychiatric situation. Here’s an idea: a time-delayed suicide pill, with no antidote, that one of the copies can take immediately after the cloning. That’s equivalent to having the agreement enforced, but it doesn’t take away any rights either. I think that addresses your concern.

Next up: a game of Russian Roulette against YOURSELF!
I expect to get back to this; I had to take care of something for work and now I’m too tired to do it justice. If I haven’t responded to it within 18 hours, please remind me.
After conferring with Blueberry via PM, we agree that we’ll need to talk in realtime to get much further with this. Our schedules are both fairly busy right now, but we intend to try to turn the discussion into a top post. (I’d also be amenable to making the log public, or letting other people observe or participate, but I haven’t talked to Blue about that.)
I imagine it would be much like a case of amnesia, only with less disorientation.
Edit: Wait, I’m looking at the wrong half. One moment.
Edit: I suppose it would depend on the circumstances—“fear” is an obvious one, although mitigated to an extent by knowing that I would not be leaving a hole behind me (no grieving relatives, etc.).
Depends on how much it cost me to make it, and how much it costs to keep it around. I’m permanently busy; I’m sure I could use a couple of extra hands around the house ;)