an algorithm for determining whether any given action should or should not be undertaken, given some predetermined utility function
That’s not how the term “utilitarianism” is used in philosophy. The utility function has to be agent-neutral. So a utility function where your welfare counts 10x as much as everyone else’s wouldn’t be utilitarian.
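To make the distinction concrete, here’s a toy sketch (my own illustration, not anything from the philosophical literature): the quoted “algorithm” just asks whether an action scores at least as well as inaction under the utility function, and the agent-neutrality point is about which utility functions count. All names and numbers below are made up.

```python
def agent_neutral_utility(welfare):
    # Agent-neutral: everyone's welfare counts equally,
    # no matter who is doing the evaluating.
    return sum(welfare.values())

def self_weighted_utility(welfare, me, my_weight=10):
    # Agent-relative: the evaluator's own welfare counts 10x.
    # Per the point above, maximizing this wouldn't be utilitarianism.
    return sum(w * (my_weight if person == me else 1)
               for person, w in welfare.items())

def should_act(action_welfare, status_quo_welfare, utility):
    # The quoted "algorithm": undertake the action iff it does at least
    # as well as not acting, according to the given utility function.
    return utility(action_welfare) >= utility(status_quo_welfare)

# A case where the two utility functions disagree: the action helps
# others at a small cost to "me".
status_quo = {"me": 5, "alice": 5, "bob": 5}
act        = {"me": 4, "alice": 7, "bob": 7}

print(should_act(act, status_quo, agent_neutral_utility))           # True
print(should_act(act, status_quo,
                 lambda w: self_weighted_utility(w, "me")))          # False
```

Both procedures fit the quoted definition of “an algorithm … given some predetermined utility function,” but only the first, agent-neutral one would be called utilitarian in the philosophical sense.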