Apparently, we mean different things by “utilitarianism”. I meant a moral system whose terminal goal is to maximize pleasure and minimize suffering in the whole world, while you’re talking about an agent’s utility function, which may have no regard for pleasure and suffering.
I agree, though, that it makes sense to try to maximize one’s utility function, but to me that’s just egoism.