If I say “I prefer not to be tortured more than you prefer a popsicle”, any sane human would agree. This is the commonsense way in which utility can be compared between humans. Of course, it isn’t perfect, but we could easily imagine ways to make it better, say by running some regression algorithms on brain-scans of humans desiring popsicles and humans desiring not-to-be-tortured, and extrapolating to other human minds. (That would still be imperfect, but we can make it arbitrarily good.)
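To make that thought experiment concrete, here is a minimal sketch of what such a comparison pipeline could look like, on entirely synthetic data. The feature dimensions, the intensity scale, and the choice of ridge regression are all illustrative assumptions, not claims about how real neuroimaging works:

```python
# Hypothetical sketch of the brain-scan regression idea above: fit a model
# mapping scan-derived features to reported desire intensity, then use the
# same model to put two different minds' desires on one common scale.
# All data here is synthetic and the setup is purely illustrative.

import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

# Assume each brain scan reduces to a 10-dimensional feature vector, and
# that training subjects rated their desire intensity on some shared scale.
n_subjects, n_features = 200, 10
true_weights = rng.normal(size=n_features)

scans = rng.normal(size=(n_subjects, n_features))
reported_intensity = scans @ true_weights + rng.normal(scale=0.5, size=n_subjects)

# Fit the regression on the training population.
model = Ridge(alpha=1.0).fit(scans, reported_intensity)

# Extrapolate to two new minds: one desiring a popsicle, one desiring
# not-to-be-tortured. The torture scan is shifted along the learned weight
# direction to stand in for a far more intense desire.
popsicle_scan = rng.normal(size=(1, n_features))
w = model.coef_
torture_scan = popsicle_scan + 5.0 * w / np.linalg.norm(w)

# Because both predictions come from the same model, they land on the same
# scale and can be compared across the two minds.
print("popsicle desire:", model.predict(popsicle_scan)[0])
print("avoid-torture desire:", model.predict(torture_scan)[0])
```

The point of the sketch is only that once you have a shared model from physical states to a single intensity scale, interpersonal comparison falls out of it; the comparison is as good as the model, which is why the original claim hedges it as imperfect but improvable.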
This isn’t just necessary if you’re a utilitarian; it’s necessary if your moral system in any way involves tradeoffs between humans’ preferences, i.e. it’s necessary for pretty much every human who’s ever lived.
So you are a hedonic utilitarian? You think that morality can be reduced to intensity of desire? I already pointed out that human preferences do not reduce to intensity of desire.
I’m not any sort of utilitarian, and that has nothing to do with my point, which is that there obviously is a sense in which I can prefer A more than you prefer B.
That’s more like it being conditional on our cooperating. If my enemy said that, I could find it offensive, and it wouldn’t compel me to change my actions. If you try to use utilitarian theory to (en)force cooperation, the argument doesn’t go through.