To my mind, the issue with copies is that it's the copies who remain exactly the same that "don't matter", whereas once you've got a bunch of copies being tortured, they're no longer identical copies and so are different people.

Maybe I'm just having trouble with Sleeping Beauty-like problems, but that's only a subjective issue for decision-making (plus I'd rather spend my time learning interesting things than bite the bullet of admitting that anyone with a suitably sick and twisted mind could Pascal-mug me). Morally, I much prefer 5,000 iterations each of two happy, fulfilled minds to 10,000 of the same one.
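(Note the totals match: $5{,}000 \times 2 = 10{,}000$, so this is a preference for two distinct minds over duplicates of a single one, not a preference for more total mind-instances.)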
Where "copies" is used interchangeably with "future versions of you under MWI or a similar realist interpretation of probability theory", then I would subject some of them to torture only for a very large potential gain and a small risk of it. "I" don't like torture, and I'd need a pretty damn big reward for that 1/N longshot to justify an (N-1)/N chance of brutal torture or slavery. This is of course assuming I'm at the status quo; if I were a slave or a Bagram/Laogai detainee, I would try to stay rational and keep fear from making me overly risk-averse about escape attempts. I haven't tried to work out my exact beliefs on it, but as said above, if I have two options, one saving a life with certainty and the other having a 50% chance of saving two, I'd take the 50% chance of saving two (assuming the cases are isolated, i.e. two guys on a lifeboat).
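To make the arithmetic explicit (a rough sketch; $U$ here is a stand-in utility function of my own, not anything formal): the longshot is worth taking only if

$$\frac{1}{N}\,U(\text{reward}) + \frac{N-1}{N}\,U(\text{torture}) > U(\text{status quo}),$$

and for large $N$ that forces $U(\text{reward})$ to be enormous. In the lifeboat case, by contrast, the two options tie in expectation,

$$E[\text{lives} \mid \text{certain}] = 1, \qquad E[\text{lives} \mid \text{gamble}] = 0.5 \times 2 = 1,$$

so preferring the gamble can't be about expected lives saved alone.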
tl;dr: it's a terrible idea in that if you only have the moral authority to condemn copies
Is your last sentence missing something? It feels incomplete.
Ah yes, I meant to type that you only have the moral authority to condemn copies to torture or slavery if they're actually you, and that it's pretty stupid to risk almost-certain torture for a small chance of a moderate benefit.