The sad thing about this policy is having to multiply the amount of suffering experienced by the punishment.
There’s a missing step in this result. Moral culpability is about judgement and condemnation of actions (and the actors who performed them), not (necessarily) about punishment. Calculation of optimal punishment is about influencing FUTURE actions, not about judging past actions. It’s not fully disconnected from past culpability, but it’s not at all the same thing.
You may have to increase total suffering, but you may not; perhaps punishing one clone randomly is sufficient to achieve the punishment goals (deterring future bad actions by that decision-maker and by observers). Even if more summed punishment is needed to reach the same level of deterrence, presumably the clones increased total joy as well, and the net moments of lives-worth-living are somewhat increased.
Now if the cloning ITSELF is a moral wrong (say, it uses resources in a way that causes unjustified harm to others), you pretty much have to overreact: make it far more painful for all the clones, and painful for more of them. But I'd argue that the culpability for the punishment pain falls on the clones as well, rather than on the judge or hangman.