No. One of those outcomes, or something else entirely, happens if I take no action. Assuming that neither the one person nor the 3^^^3 people have consented to being harmed by me, I must choose the course of action in which I harm nobody and the harm is inflicted by the abstract force instead.
If you instead offer me the choice where I prevent the harm (and the 3^^^3+1 people all consent to my doing so), then I choose to prevent the torture.
My maximal expected utility is realized in a universe in which I have taken no additional actions without the consent of every other party involved. With that satisfied, I seek to maximize my own happiness. It would make me happier to prevent a significant harm than an insignificant one, and both would be preferable to preventing no harm at all, all other things being equal.
If the people in question consented to the treatment, then the decision is amoral, and I would choose to inflict the insignificant harm.
From a strict utility perspective, if you describe the value of the torture as −1, do you describe the value of the speck of dust in one person’s eye as less than −1/(3^^^3)? There is some epsilon for which it is preferable to have a harm of epsilon done to any number of people than to have a harm of −1 done to one person. Admitting that does not prohibit you from comparing epsilons, either.
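To make that concrete, here is a minimal sketch of my own; the particular aggregation rule below is an assumption for illustration, not something stated above. Write the torture as a value of $-1$ and each speck as $-\varepsilon$ for some $\varepsilon > 0$. Under simple summation the specks are preferable only when

$$-N\varepsilon > -1 \iff \varepsilon < \frac{1}{N}, \qquad N = 3\uparrow\uparrow\uparrow 3,$$

so the stronger claim, that some epsilon is preferable for any number of people, needs an aggregation that never sums past the torture. One such bounded rule would be

$$U(N,\varepsilon) = -\bigl(1-(1-\varepsilon)^N\bigr),$$

which stays strictly above $-1$ for every finite $N$, while still letting you compare epsilons against each other, since $U$ is monotone in $\varepsilon$.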