The “it’s okay to kill copies” thing has never made any sense to me either. The explanation that often accompanies it is “well they won’t remember being tortured”, but that’s the exact same scenario for ALL of us after we die, so why are copies an exception to this?
Would you willingly submit yourself to torture for the benefit of some abstract, “extra” version of you? Really? Make a deal with a friend to pay you $100 for every hour of waterboarding you subject yourself to. See how long this seems like a good idea.
To my mind the issue with copies is that it’s copies who remain exactly the same that “don’t matter”, whereas once you’ve got a bunch of copies being tortured, they’re no longer identical copies and so are different people. Maybe I’m just having trouble with Sleeping Beauty-like problems, but that’s only a subjective issue for decision making (plus I’d rather spend time learning interesting things that won’t require me to bite the bullet of admitting that anyone with a suitably sick and twisted mind could Pascal-mug me). Morally, I much prefer 5,000 iterations each of two happy, fulfilled minds to 10,000 of the same one.
Where “copies” is used interchangeably with “future versions of you in either MWI or a similar realist interpretation of probability theory”, then I would subject some of them to torture only for a very large potential gain and a small risk of torture. “I” don’t like torture, and I’d need a pretty damn big reward for that 1/N longshot to justify an (N-1)/N chance of brutal torture or slavery. This is of course assuming I’m at the status quo; if I were a slave or a Bagram/Laogai detainee, I would try to stay rational and avoid letting fear make me overly risk-averse about escape attempts. I haven’t tried to work out my exact beliefs on this, but as said above, if I have two options, one saving a life with certainty and the other having a 50% chance of saving two, I’d prefer the chance of saving two (assuming they’re isolated, i.e. two guys on a lifeboat).
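To make that arithmetic explicit, here is a minimal expected-value sketch of the two gambles just described; the utility numbers (a reward worth 100, torture worth -1000, N = 10) are placeholder assumptions for illustration, not anything anyone in this exchange has committed to.

```python
# Minimal expected-value sketch of the two gambles above.
# All utility figures are assumed placeholders, not claims about
# anyone's actual preferences.

def expected_value(outcomes):
    """outcomes: iterable of (probability, utility) pairs."""
    return sum(p * u for p, u in outcomes)

# Gamble 1: a 1/N longshot at a reward vs. an (N-1)/N chance of torture.
N = 10
reward_utility = 100      # assumed payoff if the longshot comes up
torture_utility = -1000   # assumed (very negative) utility of torture
gamble = expected_value([(1 / N, reward_utility),
                         ((N - 1) / N, torture_utility)])
status_quo = 0
print(gamble > status_quo)  # False: 10 - 900 = -890, far below the status quo

# Gamble 2 (the lifeboat case): save one life for sure, or a 50% chance
# of saving two. Under a linear "lives saved" utility the two options
# have equal expected value, so a preference for the gamble has to come
# from something beyond naive expected lives saved.
certain = expected_value([(1.0, 1)])
risky = expected_value([(0.5, 2), (0.5, 0)])
print(certain, risky)  # 1.0 1.0
```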
tl;dr: it’s a terrible idea in that if you only have the moral authority to condemn copies
Is your last sentence missing something? It feels incomplete.
Ah yes, I meant to type that you only have the moral authority to condemn copies to torture or slavery if they’re actually you, and that it’s pretty stupid to risk almost certain torture for a small chance of a moderate benefit.
People break under torture, so I’d take precautions to ensure that the torture-copy is not allowed to make decisions about whether it should continue. Of course I’m going to regret it. That doesn’t change the fact that it’s a good idea.
Why is this a good idea in any way other than the general position that “torturing other people for your own profit is a good idea so long as you don’t care about people”? Most of human history is based around the many being exploited for the benefit of the few. Why is this different?
I suppose people should have the right to willingly submit to torture for some small benefit to another person, which is what you’re saying you’d be willing to do. But the fact that a copy gets erased doesn’t make the experience any less real, and the fact that an identical copy gets to live doesn’t in any way help the copies that were being tortured.
It’s different because (1) I’m not hurting other people, only myself, and (2) I’m not depriving the world of my victim’s potential contributions as a free person.
I don’t actually care about the avoidance of torture as a terminal moral value.
But after the fork, your copy will quickly become another person, won’t he? After all, he’s being tortured and you’re not, and he is probably very angry at you for making this decision. So I guess the question is: If I donate $1 to charity for every hour you get waterboarded, and make provisions to balance out the contributions you would have made as a free person, would you do it?
In thought experiment land… maybe. I’d have to think carefully about what value I place on myself as a special case. In practice, I don’t believe that you can fully compensate for all of the unknown contributions I might have made to society.
Pavitra is a he? I must have guessed wrong.
It’s complicated.
What are your terminal moral values?
Also, why is hurting yourself different from hurting other people? And why is not hurting others a moral value, but not avoidance of torture?
Hurting others is ethically problematic, not morally problematic. For example, I would probably be okay with hurting someone else at their own request. Avoidance of torture is a question of an entirely different type: what I value, not how I think it’s appropriate to go about getting it.
I don’t have a formalization of my terminal values, but roughly:
I have noticed that sometimes I feel more conscious than other times—not just awake/dreaming/sleeping, but between different “awake” times. I infer that consciousness/sentience/sapience/personhood (whatever you want to call it; you know, that thing we care about) is not a binary predicate, but a scalar. I want to maximize the degree of personhood that exists in the universe.
What’s the difference between ethics and morals?
So, if you create a person, and torture them for their entire life, that’s worth it?
By morals, I mean terminal values. By ethics, I mean advanced forms of strategy involving things like Hofstadter’s superrationality. I’m not sure what the standard LW jargon is for this sort of thing, but I think I remember reading something about deciding as though you were deciding on behalf of everyone who shares your decision theory.
If the most conscious person possible would be unhappy, I’d rather create them than not. The consensus among science fiction writers seems to be with me on this: a drug that makes you happy at the expense of your creative genius is generally treated as a bad thing.
Sounds like decision theory.
That link was what I needed. By ethics I mean, roughly, the difference between causal decision theory and the right answer.
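To make that difference concrete, here is a minimal sketch using a one-shot symmetric Prisoner’s Dilemma with assumed payoffs: causal decision theory treats the other player’s move as fixed and so defects, while a superrational agent who expects an identical reasoner to reach the same conclusion compares only the symmetric outcomes and cooperates. The payoff numbers are illustrative assumptions, not anything specified in this exchange.

```python
# Toy illustration of the gap between causal decision theory and
# Hofstadter-style superrationality in a one-shot symmetric
# Prisoner's Dilemma. Payoffs are assumed for the example.

# Payoff to "me" indexed by (my_move, their_move); standard ordering:
# temptation > cooperation > punishment > sucker.
PAYOFF = {
    ("C", "C"): 3,
    ("C", "D"): 0,
    ("D", "C"): 5,
    ("D", "D"): 1,
}

def cdt_choice():
    """CDT holds the other player's move fixed; defection dominates."""
    best_vs = {}
    for theirs in ("C", "D"):
        best_vs[theirs] = max(("C", "D"),
                              key=lambda mine: PAYOFF[(mine, theirs)])
    return best_vs  # {'C': 'D', 'D': 'D'}: defect no matter what they do

def superrational_choice():
    """A superrational agent facing an identical reasoner assumes both
    will output the same move, so only the diagonal outcomes are live."""
    return max(("C", "D"), key=lambda move: PAYOFF[(move, move)])

print(cdt_choice())            # {'C': 'D', 'D': 'D'}
print(superrational_choice())  # 'C'
```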
Do you mean to equate here the degree to which something is a person, the degree to which a person is conscious, and the degree to which a person is a creative genius?
That’s what it reads like, but perhaps I’m reading too much into your comment.
That seems unjustified to me.
I don’t mean to equate them. They’re each a rough approximation to the thing I actually care about.