Isn’t that the description of a utility maximizer (or optimizer) taking into account the preferences of a utility monster?
No. To get the effect, we need an optimiser that weights some people’s opinions more heavily on some matters but defers to someone else’s opinions on other matters. If we just have a utility monster whose preferences the optimiser always values above everyone else’s, we can’t get the effect. The important thing is that the optimiser sometimes cares more about one person and sometimes more about another.
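Here’s a minimal sketch of the distinction in Python (all names, numbers, and weights are hypothetical illustrations, not anything from the original discussion). With a fixed “utility monster” weighting, one person’s preferences win in every domain; with domain-dependent weights, the optimiser sides with different people in different domains.

```python
# Each person's utility for each option, split by domain ("matter").
# All values are made up for illustration.
utilities = {
    "dinner":  {"alice": {"pizza": 9, "salad": 2},
                "bob":   {"pizza": 1, "salad": 8}},
    "holiday": {"alice": {"beach": 3, "city": 7},
                "bob":   {"beach": 9, "city": 2}},
}

def best_option(domain, weights):
    """Pick the option maximising the weighted sum of everyone's utilities."""
    people = utilities[domain]
    options = next(iter(people.values())).keys()
    return max(options,
               key=lambda o: sum(weights[p] * people[p][o] for p in people))

# Utility monster: Alice's preferences always dominate, in every domain.
monster_weights = {"alice": 10.0, "bob": 1.0}

# Domain-dependent weights: Alice dominates on dinner, Bob on holidays.
per_domain_weights = {
    "dinner":  {"alice": 10.0, "bob": 1.0},
    "holiday": {"alice": 1.0,  "bob": 10.0},
}

for domain in utilities:
    print(domain,
          "| monster:", best_option(domain, monster_weights),
          "| per-domain:", best_option(domain, per_domain_weights[domain]))
```

Running this, the monster weighting picks Alice’s favourite in both domains (pizza, city), while the per-domain weighting sides with Alice on dinner (pizza) but with Bob on holidays (beach). That switching between whose opinion wins is exactly what a single fixed utility-monster weighting can’t produce.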