The first paragraph of this comment is a nitpick, but I felt compelled to make it: there is no way that 10^14 dust specks is anywhere near enough to equal one torture victim. Maybe if you multiplied it by a googolplex, and then by the number of atoms in the universe, you'd be within a few orders of magnitude.
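(As a quick back-of-the-envelope sketch, not anything from Eliezer's post: here is a short Python snippet that makes the orders of magnitude concrete, using my own rough constants for the speck count, atom count, and googolplex. It works in log10 throughout, since none of these numbers fit in a machine word.)

```python
import math

# Everything below is tracked as a log10 value, so multiplying numbers means adding logs.
dust_specks_log10 = 14       # the 10^14 specks from the nitpick above
atoms_log10 = 80             # ~10^80 atoms in the observable universe (rough estimate)
googolplex_log10 = 10**100   # log10(googolplex) = 10^100

product_log10 = dust_specks_log10 + atoms_log10 + googolplex_log10
# The product itself is ~10^(10^100); even its *logarithm* has 101 decimal digits.
print(f"log10(specks * googolplex * atoms) has {len(str(product_log10))} digits")

# 3^^^3 = 3^^(3^^3): a power tower of 3s that is 3^3^3 = 7,625,597,484,987 levels tall.
# Climb the first few levels of that tower, tracking log10 of the value at each level.
log10_3 = math.log10(3)
tower_log10 = log10_3        # level 1: just 3
for level in range(2, 5):
    # log10(3^x) = x * log10(3), where x is the full value from the previous level
    tower_log10 = (10 ** tower_log10) * log10_3
    print(f"tower level {level}: log10(value) ~ {tower_log10:.4g}")
```

Level 4 of the tower already has a log10 around 3.6 * 10^12; level 5's log10 would be roughly 10^(3.6 * 10^12), leaving the ~10^100 figure above far behind, and 3^^^3 sits trillions of levels further up the tower. That is the sense in which the speck side of the ledger simply cannot be outrun.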
And now for the meaty response.
You're making the whole case extremely arbitrary and ignoring utility metrics; I'll now try to demonstrate both points.
Eliezer chose the number 3^^^3 precisely so that no calculation of the torture's disutility could ever match the combined disutility of the specks, even if you have deontological qualms about torture (which most humans do). The two simply don't compare. Utilitarianism in the real world breaks down on fringe cases because utility can't actually be measured; but if you could measure it, you'd always pick the option with the slightly higher utility, every single time. In your example,
We breathe a huge sigh of relief—you don't have to torture anybody, because the math worked out in your favor by a vanishingly small fraction! Then Omega suddenly tells you he's changing the deal—he's going to be putting a dust speck in YOUR eye, as well.
you ignore the part of my utility function that includes selflessness. Sacrificing something that means little to me in order to spare someone else intense suffering yields positive utility for me, and, I assume, for other people as well. (Interestingly, this also invalidates your earlier example in which you polled the 3^^^3 people about what they wanted: you left altruism out of the calculation.)
Your problems with the Torture vs. Dust Specks dilemma all boil down to “Here’s how the decision changes if I change the parameters of the problem!” (and that doesn’t even work in most of your examples).
Here's the real problem underlying the equation, one that's invulnerable to nitpicks:
Omega comes to you and says, "I will create either 3^^^3 units of disutility, or an amount of disutility equal to or less than the destruction of a single galaxy full of sentient life. Which do you choose?"
As has been said before,
I think the answer is obvious.