Personally, I experience more or less the same internal struggle with this question as with the other: I endorse the idea that what matters is total utility, but my moral intuitions aren’t entirely aligned with that idea, so I keep wanting to choose the individual benefit (joy or non-torture) despite being unable to justify choosing it.
Also, as David Gerard says, it’s a different function… that is, you can’t derive an answer to one question from an answer to the other… but the numbers we’re tossing around are so huge that the difference hardly matters.
I don’t think it’s silly at all.