I hold that moral intuitions are nothing but learned prejudices. Historic examples, from slavery to the divine right of kings, to tortured confessions of witchcraft or Judaism, to the subjugation of women, to genocide, all point to the fallibility of these ‘moral intuitions’. There is absolutely no sense to the claim that their conclusions are to be adopted before those of a reasoned argument.
Act utilitarianism not only requires no desire for alcohol, it requires no desire for anything other than to maximize utility. If the agent likes the taste of steak better than hamburger, then there will be an instance in which he will sacrifice maximum utility for a steak. If he has a strong preference, it will have the same effect as a strong preference for alcohol. If he has an aversion to pain, a desire for sex, a particular interest in the well-being of his children, there are instances in which she will sacrifice her desire to maximize utility to obtain fulfillment of any of these other desires.
I hold that a moral commandment to act as an act-utilitarian is no different than a commandment to alter the gravitational constant to a number that maximizes utility, or a commandment to move the Earth to an orbit that would produce a more pleasing climate. If it cannot be done, there is no sense in saying that it ought to be done.
-Alonzo Fyfe
Fallible relative to what?
Full context here.
Skimming around his site, it's interesting, but I think he made a basic mistake.
From here:
Of course, the definition of my utility function will include a term for steaks, or alcohol, or whatever intrinsic value they help me achieve. Maximizing utility is not, therefore, contradictory to valuing a steak. My desire to maximize utility includes my desire to eat steak (or whatever intrinsic value it helps me attain).
This seems like a real simple mistake, so maybe I am simply misunderstanding him. Anyone who knows his work better care to comment (at least before I have more time to poke around his site some more)?
Fyfe annoys me sometimes because he continuously ignores my requests to express concepts in mathematical language.
I didn’t read any more of his site, but just from the excerpt you gave, her [1] point seems to be that if you value total utility, then you will have to deprive yourself to benefit people in general, which people can’t do—they inevitably act as if their own utility carries more weight than that of others.
[1] Hey, if he can use pronouns confusingly and inconsistently, so can we!
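To put that point in rough numbers (a toy sketch in Python with made-up payoffs; the names and numbers are mine, not anything from Fyfe's site): my own utility function can have a term for steak, but the act-utilitarian target is the sum of everyone's utility, and the two can recommend different acts.

# Toy sketch (invented payoffs): an agent's own utility function includes a
# term for steak, but act utilitarianism tells him to maximize the TOTAL.

def my_utility(act):
    # I like the taste of steak better than hamburger.
    return {"steak": 10, "hamburger": 6}[act]

def everyone_elses_utility(act):
    # Suppose the cheaper hamburger leaves more resources for others.
    return {"steak": 1, "hamburger": 8}[act]

def total_utility(act):
    # The act-utilitarian objective: sum over everyone.
    return my_utility(act) + everyone_elses_utility(act)

acts = ["steak", "hamburger"]
print(max(acts, key=my_utility))     # steak     -- what my own desires pick
print(max(acts, key=total_utility))  # hamburger -- what act utilitarianism picks

So having a steak term in my own utility function doesn't make maximizing my utility the same thing as maximizing total utility; whenever the two come apart, the act-utilitarian prescription is to take the hit myself, which is exactly the part Fyfe claims people cannot actually do.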
“Reasoned argument”, it says.
And how does that help if the premises in your “reasoned argument” are arrived at via intuition?