They could just insist on a normalization scheme that is blatantly biased in favor of their utility function. In a theoretical sense this causes no problem, since there is no objective way to define an unbiased normalization anyway. (Of course, if everyone insisted on biasing the normalization in their own favor, there would be a problem.)
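To make the normalization point concrete, here is a minimal sketch (not from the original comments; the agents, numbers, and the `range_normalize` helper are all hypothetical) showing how the choice of normalization can decide whose preferences win when utilities are summed:

```python
def range_normalize(utils):
    """Rescale a utility vector so its worst outcome maps to 0 and its best to 1."""
    lo, hi = min(utils), max(utils)
    return [(u - lo) / (hi - lo) for u in utils]

# Two hypothetical agents ranking three outcomes A, B, C.
alice = [10, 8, 0]   # Alice: A best, B close behind, C worst
bob   = [0, 5, 10]   # Bob: C best, B middling, A worst

# "Fair-looking" range normalization: both agents' utilities span [0, 1].
norm_a = range_normalize(alice)          # [1.0, 0.8, 0.0]
norm_b = range_normalize(bob)            # [0.0, 0.5, 1.0]
totals_fair = [a + b for a, b in zip(norm_a, norm_b)]

# A normalization biased toward Bob: his utilities rescaled to span [0, 3].
biased_b = [3 * u for u in norm_b]
totals_biased = [a + b for a, b in zip(norm_a, biased_b)]

outcomes = ["A", "B", "C"]
print("equal-range totals:", dict(zip(outcomes, totals_fair)))
print("winner:", outcomes[totals_fair.index(max(totals_fair))])      # B (compromise)
print("Bob-biased totals:", dict(zip(outcomes, totals_biased)))
print("winner:", outcomes[totals_biased.index(max(totals_biased))])  # C (Bob's favorite)
```

Under the equal-range scheme the compromise outcome B wins; under the scheme scaled in Bob's favor, Bob's preferred outcome C wins. Neither scheme is objectively the "unbiased" one, which is the theoretical point above.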
I think most of those involved realise that such projects tend to be team efforts—and therefore some compromises over values will be necessary. Anyway, I think this is the main difficulty for utilitarians: most people are not remotely like utilitarians—and so don’t buy into their bizarre ideas about what the future should be like.