What does that look like with respect to shaping-the-values-of-others? I won’t, here, attempt a remotely complete answer.
In short: if you substitute the “agency of all agents” itself as the “value to be maximized,” the repugnance vanishes from utilitarianism, and it gets a lot closer to what it seems you’re searching for and advocating.