Do you consider a mind that has been tortured identical to one that has not? Won’t the torture process add non-trivial differences, to the point where the minds don’t count as identical?
It’s not a binary distinction. If an identical copy were made of one mind and tortured, while the other instance remained untortured, they would start to differentiate into distinct individuals. As the rate of divergence would increase with the degree of difference in experience, I imagine torture vs. non-torture would spark a fairly rapid divergence.
I haven’t had the opportunity to read Bostrom’s paper in full, but from the little I did read, Bostrom thought it was “prima facie implausible and farfetched to maintain that the wrongness of torturing somebody would be somehow ameliorated or annulled if there happens to exist somewhere an exact copy of that person’s resulting brain-state.” That is, it seemed obvious to Bostrom that having two identical copies of a tortured individual must be worse than one instance of a tortured individual (actually twice as bad, if I interpret correctly). That does not at all seem obvious to me, as I would consider two (synchronized) copies to be one individual in two places. The only thing worse about having two copies that occurs to me is a greater risk of divergence, leading to increasingly distinct instances.
Are you asking whether it would be better to create a copy of a mind and torture it rather than not creating a copy and just getting on with the torture? Well, yes. It’s certainly worse than not torturing at all, but it’s not as bad as torturing the only instance of the mind. Initially, the individual would half-experience the torture. Fairly soon after, the single individual would separate into two minds, one being tortured and one not. This is arguably still better, from the perspective of the pre-torture mind, than the single-mind-single-torture scenario, since at least half of the mind’s downstream experiences are untortured, versus 100% torture in the other case.
If this doesn’t sound convincing, consider a twist: would you choose to copy and rescue a mind-state from someone about to, say, be painfully sucked into a black hole, or would it be ethically meaningless to create a non-sucked-into-black-hole copy? Granted, it would be best to not have anyone sucked into a black hole, but suppose you had to choose?
Looks to me like Bostrom is trying to make the point that duplication of brain-states, by itself and devoid of other circumstances, is not sufficient to make the act of torture moral, or less harmful.
After reading through the paper, it looks to me like we’ve moved outside of what Bostrom was trying to address here. If synchronized brains lose individuality, and/or an integration process takes place leading to a brain-state which has learned from the torture experience but remains unharmed, then the scenario falls outside the scope of Bostrom’s argument.
I agree with Bostrom on this point. It looks to me like, if Yorik is dismissing 49 tortured copies as inconsequential, he must also show that there is a process by which the knowledge accumulated by each of the 49 copies is synchronized and integrated into the one remaining copy, without causing that copy (or anyone else, for that matter) any harm. Or there must be some other assumptions he is making about the copies that remove the harm done to them; copying alone can’t remove responsibility for killing the copies.
For the black-hole example, copying the person about to be sucked into the hole is not ethically meaningless. The value of the copy, though, comes from its continued existence. The act of copying does not remove the moral consequences of the sucking-into-the-black-hole act. If there is an agent X who pushed the person into the black hole, that agent is just as responsible for his actions if he doesn’t copy the individual at the last minute as he would be if he does make a copy.
Can you please point me to Bostrom’s paper? I can’t seem to find the reference.
I’m very curious whether the quote is better fleshed out in context. As it stands here, it looks a lot like it’s affected by anthropomorphic bias (or maybe rests on a large number of hidden assumptions that I don’t share, around both the meaning of individuality and the odds that intelligences which regularly undergo synchronization would remain similar to ours).
I can imagine a whole space of real-life, many-integrated-synchronized-copies scenarios where the process of creating a copy and torturing it for kicks would be accepted, commonplace, and would not cause any sort of moral distress. To me, there is a point where torture and/or destruction of a synchronized, integrated, identical copy transitions into the same moral category as body piercings and tattoos.
Nick Bostrom, “Quantity of Experience: Brain-Duplication and Degrees of Consciousness”, Minds and Machines 16(2), 2006.