His first, and seemingly most compelling, argument for Duplication over Unification is that, assuming an infinite universe, it’s certain (with probability 1) that there is already an identical portion of the universe where you’re torturing the person in front of you. Given Unification, it’s meaningless to distinguish between that portion and this one, since they are physically identical, so torturing the person is morally blameless: you’re not increasing the number of unique observers being tortured.
I’d argue that the torture portion is not identical to the not-torture portion, and that the difference traces back to at least one event in the common prior history of both portions of the universe where they diverged. Unification only makes counterfactual worlds real; it does not cause every agent to experience every counterfactual world. Agents are differentiated by the choices they make, and agents who perform torture are not the same agents as those who abstain from it. The difference can be made arbitrarily small, for instance by considering an agent who commits torture with 50% probability based on the outcome of a quantum coin flip, but the moral question then is why an agent would choose to become 50% likely to commit torture in the first place. Some counterfactual agents will make that choice, but they will be very different from the agents who are only 1% likely to commit torture.
I think you’re interpreting Bostrom slightly wrong. You seem to be reading his argument (or perhaps just my short distillation of it) as saying that you’re not currently torturing someone, but there’s an identical section of the universe elsewhere where you are torturing someone, so you might as well start torturing now.
As you note, that’s contradictory—if you’re not currently torturing, then your section of the universe must not be identical to the section where the you-copy is torturing.
Instead, assume that you are currently torturing someone. Bostrom’s argument is that you’re not making the universe worse, because there’s a you-copy torturing an identical person elsewhere in the universe. At most one of your copies is capable of taking blame for this; the rest are just running the same calculations “a second time”, so to speak. (Or at least, that’s what he argues Unification would say, and he uses this as a reason to reject it and turn to Duplication, under which each copy is morally culpable for causing new suffering.)