If “shut up and multiply” were always the right answer, if you could break everything down to utilons and go with whatever provides the most utility for the most people, then the logical thing to do would be to create a trillion intelligent beings in the solar system who do nothing but enjoy pleasure all the time.
Yes, that’s the right answer if you count one intelligent being who does nothing but wallow in pleasure as a net gain, you agree that there aren’t diminishing returns on creating such beings, and creating them doesn’t divert resources from any more important values.
But if you find the notion of even one such being repulsive or neutral, then you’re “shutting up and multiplying” over a non-positive number. In decision theory, utility quantifies how much you personally prefer certain scenarios, not pleasure.
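To make the multiplication point concrete, here is a trivial sketch (my own framing, with made-up utility numbers, not anyone’s settled theory):

```python
# Toy illustration: naive "shut up and multiply" aggregation is just
# per-being utility times population size.
def total_utility(per_being_utility: float, population: int) -> float:
    """Total utility under naive linear aggregation."""
    return per_being_utility * population

# If you score a wirehead-style being as a net gain, scaling it up helps:
print(total_utility(1.0, 10**12))   # 1e12 -- a huge win

# But if you score such a being as neutral or repulsive, no population
# size makes the product positive:
print(total_utility(0.0, 10**12))   # 0.0 -- multiplying zero gains nothing
print(total_utility(-0.5, 10**12))  # -5e11 -- scaling only makes it worse
```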
I think we’re saying the same thing. Pleasure undoubtedly has some positive utility (or else we wouldn’t seek it out). But, like you said, creating such beings diverts resources from “more important values”. That was, in a sense, the whole point of Eliezer’s story about the “Superhappies”.
So, by the same token, if we think “nobody should be tortured” is a more important value than “avoiding small amounts of annoyance”, then we should not sacrifice the former for the latter, not even for very large values of “avoiding annoyances” (see the sketch below).
The only difference is that this is more obvious when we’re talking about positive values (like pleasure) than when we’re talking about negative values (like avoiding someone being tortured).
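Here is a minimal sketch of that lexical-priority reading (my own illustration; the class names and counts are hypothetical, not a claim about anyone’s actual utility function). Outcomes are compared as tuples, higher-priority disvalues first, so no count of minor annoyances ever outweighs even one torture:

```python
from typing import NamedTuple

class Outcome(NamedTuple):
    tortures: int     # higher-priority disvalue
    annoyances: int   # lower-priority disvalue

def lexically_worse(a: Outcome, b: Outcome) -> bool:
    """True if a is worse than b under lexical priority of torture over annoyance."""
    return (a.tortures, a.annoyances) > (b.tortures, b.annoyances)

torture_world = Outcome(tortures=1, annoyances=0)
speck_world   = Outcome(tortures=0, annoyances=3**33)  # astronomically many

# Under plain linear aggregation with any fixed exchange rate, a large
# enough annoyance count eventually dominates; under lexical comparison
# it never does:
print(lexically_worse(torture_world, speck_world))  # True, regardless of the count
```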