Lossiness is itself a quantity that gets optimized, and its importance varies across domains with differing payoff structures. Clashes are often the result of two locally valid choices of lossiness function conflicting when attempts are made to propagate them more globally.
"Better definition" -> loses less of the things I think are important and more of the things I think are unimportant. People who have faced a different payoff structure will have strenuous objections. By the law of large numbers, you will be able to find people who have faced a completely perverse data set of edge cases and thus hold a radically different payoff structure. If such people exist at both ends of a particular distribution, you get that effect no matter which end you optimize for.
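A minimal sketch of the idea, with purely illustrative feature names and weights: treat a definition as a lossy compression that keeps some features of a concept and drops the rest, and treat each person's payoff structure as a set of importance weights. The same definition then scores very differently for different people.

```python
# Hypothetical sketch: a "definition" keeps a subset of a concept's
# features; each person weights features by their own payoff structure.
# All feature names and weights below are made-up assumptions.

def loss(kept_features, weights):
    """Total importance-weight of the features a definition drops."""
    return sum(w for feature, w in weights.items() if feature not in kept_features)

# Two candidate definitions of the same concept, keeping different features.
definition_a = {"shape", "size"}
definition_b = {"behavior", "context"}

# Two people whose histories have taught them very different weightings.
alice = {"shape": 0.6, "size": 0.3, "behavior": 0.05, "context": 0.05}
bob   = {"shape": 0.05, "size": 0.05, "behavior": 0.5, "context": 0.4}

print(loss(definition_a, alice), loss(definition_a, bob))  # A: fine for Alice, terrible for Bob
print(loss(definition_b, alice), loss(definition_b, bob))  # B: the reverse
```

Each locally valid choice (A for Alice, B for Bob) is near-lossless for its owner and near-maximally lossy for the other, which is exactly the clash that appears when either tries to propagate their definition globally.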
Monocultures make this worse because, in effect, they prevent people from taking their ball and going home, i.e. deciding to use alternative functions for the assignation of meaning.