Only if you agree to follow EY in consolidating many different utilities in every possible case into one all-encompassing number, something I am yet to be convinced of, but that is beside the point, I suppose.
If you have a preference for some outcomes versus other outcomes, you are effectively assigning a single number to those outcomes. The method of combining these is certainly a viable topic for dispute—I raised that point myself quite recently.
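To make that concrete, here is a minimal sketch in Python (the outcomes and their ranking are invented purely for illustration, not taken from the original problem): any strict ranking over a finite set of outcomes can be encoded by giving each outcome a number, so that "prefer A to B" becomes "the number for A is higher than the number for B".

```python
# Toy illustration only: the outcomes and their ordering are made up for this example.
# A strict preference ranking over finitely many outcomes can always be encoded
# as a single number per outcome, with "preferred" meaning "higher number".
outcomes_ranked_worst_to_best = [
    "fifty years of torture",
    "dust speck in the eye",
    "nothing happens",
]
utility = {outcome: rank for rank, outcome in enumerate(outcomes_ranked_worst_to_best)}

def prefers(a, b):
    """True exactly when outcome a is ranked above outcome b."""
    return utility[a] > utility[b]

print(prefers("nothing happens", "dust speck in the eye"))        # True
print(prefers("fifty years of torture", "dust speck in the eye")) # False
```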
Sure, if you pick something with a guaranteed negative utility and you think that there should be one number to bind them all, I grant your point.
However, this is not how the problem appears to me. A single speck in the eye has such an insignificant utility that there is no way to estimate its effects without knowing a lot more about the problem.
It was quite explicitly made a part of the original formulation of the problem.
Considering the assumptions you are unwilling to make:
tiny utility can be reasonably well estimated, even up to a sign
As I’ve been saying, there quite clearly seem to be things that fall in the realm of “I am confident this is typically a bad thing” and “it runs counter to my intuition that I would prefer torture to this, regardless of how many people it applied to”.
the resulting number is invariably useful for decision making
I addressed this at the top of this post.
zillions of those utilities can be combined into one single number using a monotonic function
these utilities do not interact in any way that would make their combination change sign
I think it’s clear that there must be some means of combining individual preferences into moral judgments, if there is a morality at all. I am not certain that it can be done with the utility numbers alone. I am reasonably certain that the combination is monotonic; I cannot conceive of a situation where we would prefer some people to be less happy purely for the sake of their being less happy. More than monotonicity is needed here, however: holding each person’s disutility fixed, the aggregate must diverge, growing arbitrarily bad as the number of people affected grows without bound. I raise this point here, and at present I think it is the closest thing to a reasonable attack on Eliezer’s argument.
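To illustrate why divergence matters and not just monotonicity, here is a toy sketch (the specific utility numbers, and the two aggregator functions, are invented solely for illustration): an additive aggregator is unbounded, so enough specks eventually outweigh the torture, while a monotone but bounded aggregator never crosses that line no matter how many people are involved.

```python
import math

SPECK = -1e-9    # hypothetical per-person disutility of one dust speck
TORTURE = -1e6   # hypothetical disutility of fifty years of torture

def additive_aggregate(n):
    """Plain summation: unbounded (divergent) in n, so specks eventually outweigh torture."""
    return n * SPECK

def bounded_aggregate(n):
    """Monotonically decreasing in n but bounded below by -1:
    no number of specks ever becomes worse than the torture."""
    return -(1.0 - math.exp(n * SPECK))

for n in (10**3, 10**12, 10**18):
    print(n, additive_aggregate(n), bounded_aggregate(n))

# additive_aggregate drops below TORTURE once n exceeds 1e15 with these numbers,
# while bounded_aggregate never drops below -1, however large n grows.
```

Whether the true moral aggregator behaves more like the first function or the second is exactly the open question I am flagging.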
On balance, I think Eliezer is likely to be correct; I am not worried enough to stake some fraction of 3^^^3 utilons on the contrary, and I would presently pick torture if I were truly confronted with this situation and didn’t have more time to discuss, debate, and analyze. Given that there is not enough matter in the universe to make 3^^^3 dust specks, much less the eyes for them to fly into, I am supremely confident that I won’t be confronted with this choice any time soon.
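For a sense of scale, here is the standard Knuth up-arrow recursion (plain background on the notation, nothing specific to this thread; the function name is mine): 3^^^3 is a power tower of 3s that is 3^^3 = 7,625,597,484,987 levels tall, and even 3^^4 already has trillions of digits, far beyond the roughly 10^80 particles in the observable universe.

```python
def up_arrow(a, arrows, b):
    """Knuth up-arrow notation: a with the given number of arrows applied to b.
    Only tiny cases are actually computable."""
    if arrows == 1:
        return a ** b
    if b == 1:
        return a
    return up_arrow(a, arrows - 1, up_arrow(a, arrows, b - 1))

print(up_arrow(3, 2, 3))  # 3^^3 = 3**3**3 = 7,625,597,484,987

# 3^^^3 = 3^^(3^^3): a tower of 3s that is 7,625,597,484,987 levels tall.
# Already 3^^4 = 3**7_625_597_484_987 has trillions of digits, so trying to
# evaluate up_arrow(3, 3, 3) is hopeless on any physically possible computer.
```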