I guess the reason is maximizing one’s utility function, in general. Empathy is just one component of the utility function (for those agents who feel it).
If multiple agents share the same utility function, and they know it, it should make their cooperation easier, because they only have to agree on facts and models of the world; they don’t have to “fight” against each other.
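To make that point concrete, here is a minimal toy sketch (the agents, actions, world states, and utility numbers are all invented for illustration): when two agents maximize one shared utility function, any disagreement about the best joint plan can only come from differing beliefs about the world, not from conflicting goals.

```python
from itertools import product

ACTIONS = ["build_dam", "plant_crops"]

# One utility function shared by both agents, defined over
# (joint action, world state); the numbers are arbitrary.
UTILITY = {
    (("build_dam", "build_dam"), "rain"): 5,
    (("build_dam", "build_dam"), "drought"): 1,
    (("build_dam", "plant_crops"), "rain"): 3,
    (("build_dam", "plant_crops"), "drought"): 2,
    (("plant_crops", "build_dam"), "rain"): 3,
    (("plant_crops", "build_dam"), "drought"): 2,
    (("plant_crops", "plant_crops"), "rain"): 4,
    (("plant_crops", "plant_crops"), "drought"): 0,
}

def best_joint_action(p_rain):
    """Joint action maximizing expected shared utility under a belief p(rain)."""
    def expected_utility(joint):
        return (p_rain * UTILITY[(joint, "rain")]
                + (1 - p_rain) * UTILITY[(joint, "drought")])
    return max(product(ACTIONS, repeat=2), key=expected_utility)

# Agents with the same belief about the world pick the same best joint plan;
# disagreement can only arise if their world models (p_rain) differ.
print(best_joint_action(p_rain=0.9))
print(best_joint_action(p_rain=0.1))
```

So the only thing left to negotiate is the model of the world (here, how likely rain is), which is what the comment above means by not having to "fight" against each other.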
Apparently, we mean different things by “utilitarianism”. I meant a moral system whose terminal goal is to maximize pleasure and minimize suffering in the whole world, while you’re talking about an agent’s utility function, which may have no regard for pleasure and suffering.
I agree, though, that it makes sense to try to maximize one’s utility function, but to me that’s just egoism.