The more I think about this, the more I want to make a bunch of clones of myself. I don’t even see why I’d need to destroy them. I shouldn’t have to pay for them; they can get their own jobs, so wealth isn’t that much of a concern.
Coming up with a plan that’s acceptable to both versions of yourself before going through with the cloning should be about as easy as coming up with a plan that’s acceptable to just one version, once you’re using the right kind of framework to think about it.
The concern is that immediately after you clone, both copies agree that Copy 1 should live and Copy 2 should die, but afterwards, Copy 2 doesn’t want to lose those experiences. If you decide beforehand that you only want one of you around, and Copy 2 is created specifically to be destroyed, there should be a way to bind Copy 2 to suicide.
Disagree. I would class that as murder, not suicide, and consider creating a clone who would be subject to such binding to be unethical.
Calling it murder seems extreme, since you end up surviving. What’s the difference between binding a copy to suicide and binding yourself to take a sleep-amnesia pill?
If it’s not utterly voluntary when committed, I don’t class it as suicide. (I also consider ‘driving someone to suicide’ to actually be murder.)
My solution to the ethical dilemma, to reword it, is to give the clone full human rights from the moment it’s created (actually a slightly expanded version of current human rights, since we’re currently prohibited from suiciding). I assume that it’s not currently possible to enforce a contract that will directly cause one party’s death; that aspect of inter-human interaction should remain. The wealth-split serves as a balance in two ways: suddenly having your wealth halved would be traumatic for almost anyone, which gives a clone that had planned to suicide extra impetus to do so, and it should also strongly discourage people from taking unnecessary risks when making clones. In other words, that’s not a bug, it’s a feature.
The difference between what you proposed and the sleeping pill scenario is that in the latter, there’s never a situation where an individual is deprived of rights.
I’m still unclear why you classify it as death at all. You end up surviving it.
I think you’re thinking of each copy as an individual. I’m thinking of the copies collectively as a tool used by an individual.
Ok, say you enter into a binding agreement forcing yourself to take a sleeping pill tomorrow. You have someone there to enforce it if necessary. The next day, you change your mind, and the person forces you to take the pill anyway. Have you been deprived of rights? (If it helps, substitute eating dessert, or gambling, or doing heroin for taking the pill.)
I don’t think any such agreement could be legally binding under current law, which is relevant since we’re talking about rights.
Yes, I am, and as far as I can tell mine’s the accurate model. Each copy is separately alive and conscious; they should no more be treated as the same individual than twins are treated as the same individual. (Otherwise, why is there any ethical question at all?)
This kind of question comes up every so often here, and I still haven’t heard or thought of an answer that satisfies me. I don’t see it as relevant here, though, because I do recognize the clone as a separate individual who shouldn’t be coerced.
But if my copies and I don’t think that way, is it still accurate for us? We agree to be bound by any original agreement, and we think any of us are still alive as long as one of us is, so there’s no death involved. Well, death of a living organism, but not death of a person.
It’s the same question, because I’m assuming both copy A and copy B agree to be bound by the agreement immediately after copying (which is the same as the original making a plan immediately before copying). Both copies share a past, so if you can be bound by your past agreements, so can each copy. Even if the copies are separate individuals, they don’t have separate pasts.
If you and all your copies think that way, then you shouldn’t have to worry about them defecting in the first place, and the rule is irrelevant for you. How sure are you that that’s what you really believe, though? Sure enough to bet 1⁄2 your wealth?
My concern with having specific copies be bound to past agreements is that I don’t trust that people won’t abuse that: it’s easy not to see the clone as ‘yourself’, but as an easily exploitable other. Here’s a possible solution to that problem (though I like it less than not binding the clone to prior agreements at all): clones can only be bound by prior agreements that randomly determine which one acts as the ‘new’ clone and which acts as the ‘old’ clone. So, if you split off a clone to go review a movie for you, and pre-bind the clone to die after reporting back, there’s a 50% chance, determined by a coin flip, that it’s you, the original, who will review the movie, and the clone who will continue with your life.
There isn’t an “original”. After the copying, there’s Copy A and Copy B. Both are me. I’m fine with randomly selecting whether Copy A or Copy B goes to see the movie, but it doesn’t matter, since they’re identical (until one sees the movie). In fact, there is no way to not randomly select which copy sees the movie.
From the point of view of the clone who sees the movie (say it’s bad), “suiciding” is the same as him going back in time and not seeing the movie. So I’d always stick to a prior agreement in a case like that.
I don’t really have any wealth to speak of. But they’re all me. If I won’t defect, then they won’t. The question is just whether or not we might disagree on what’s best for me. In which case, we can either go by prior agreement, or just let them all live. If the other mes really wanted to live, I’d let them. For instance, say I made 5 copies and all 5 of us went out to try different approaches to a career, agreeing the best one would survive. If a year later more than one claimed to have the best result for Blueberry, I might as well let more than one live.
ETA: However, there might be situations where I can only have one copy survive. For instance, I’m in a grad program now that I’d like to finish, and more than one of me can’t be enrolled for administrative reasons. So if I really need only one of me, I guess we could decide randomly which one would survive. I’m all right with forcing a copy to suicide if he changes his mind, since I’m making that decision for all the clones ahead of time to lead to the best outcome for Blueberry.
Response to ETA:
If one of the clones developed enough individuality to change his mind and disagree with the others, I definitely don’t see how you could consider that one anything other than an individual.
Likewise, if all of the clones decided to change their minds and go their separate ways, that would be functionally the same as you-as-a-single-person-with-a-single-body changing your mind about something, and the general rule there is that humans are allowed to do that, without being interfered with. I don’t see any reason to change that rule.
Be careful of generalizing from one example. I’m relatively certain that the vast majority of people who might consider cloning themselves wouldn’t see it the way you do, and would in fact need significant safeguards to protect the version of themselves who remembers waking up in a lab from being abused by the version of themselves who remembers going home after having their DNA sampled and their brain scanned.
I did have people like you in mind, at least peripherally, in my original suggestion, though: I’m fairly sure that the original proposal doesn’t take away any rights that you already have. (To the best of my knowledge, it is illegal for someone to force you to take a sleeping pill, even if you previously agreed to it, and my knowledge there is a bit better than average; remember that I worked at a nursing home.)
I’d like to hear more about this. First, I was imagining an identical atom-for-atom duplicate being constructed, in such a way that there is no fact of the matter who’s the original. As in, you press a button and there are two of you. I wasn’t thinking about an organism grown in a lab. But I’m not sure that matters, except that the lab scenario makes it easier to think of one copy being in control of the other copy.
You think the majority of people would worry about, and would need to worry about, one copy abusing the other copy? Why? The copies would have to fight for control first, which should be an even fight. And what would the point be?
Yes, that’s illegal except maybe in an emergency psychiatric situation. Here’s an idea: a time-delayed suicide pill, with no antidote, that one of the copies can take immediately after the cloning. That’s equivalent to having the agreement enforced, but it doesn’t take away any rights either. I think that addresses your concern.
Next up: a game of Russian Roulette against YOURSELF!
I expect to get back to this; I had to take care of something for work and now I’m too tired to do it justice. If I haven’t responded to it within 18 hours, please remind me.
After conferring with Blueberry via PM, we agree that we’ll need to talk in realtime to get much further with this. Our schedules are both fairly busy right now, but we intend to try to turn the discussion into a top post. (I’d also be amenable to making the log public, or letting other people observe or participate, but I haven’t talked to Blue about that.)