“Contradictory” could also be called “decomposable”. I.e. my utility for food can be decomposed into a balancing act between a sub-utility that likes to eat and a sub-utility that wants to be thin. I’d say this sort of decomposition is the opposite of messiness.
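The "balancing act" reading can be made concrete. A minimal sketch, where the two sub-utility functions and all numbers are hypothetical illustrations, not anything from the comment:

```python
# Hypothetical sketch: a "contradictory" food preference modeled as a
# weighted combination of two sub-utilities, rather than one messy value.

def u_eat(calories):
    """Sub-utility that likes eating (diminishing returns)."""
    return calories ** 0.5

def u_thin(calories):
    """Sub-utility that wants to be thin (penalizes intake)."""
    return -0.05 * calories

def u_food(calories, w_eat=1.0, w_thin=1.0):
    """Overall utility for food: a balance between the two sub-utilities."""
    return w_eat * u_eat(calories) + w_thin * u_thin(calories)

# The combined utility has a well-defined optimum between the extremes
# that neither sub-utility would choose on its own:
best = max(range(0, 2001), key=u_food)  # -> 100 with these toy numbers
```

Nothing here is contradictory in the formal sense; the apparent tension between the parts resolves into an ordinary trade-off once they are combined.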
“Changeable” could be called “context-dependent”. I.e. when your situation changes, so do your preferences. This too doesn’t suggest any sort of “messiness” to me.
The issue of “underdefined” and “manipulable” is more complex, but I don’t believe that “messy values” is the right diagnosis. I propose that a big factor here is the human inability to predict and evaluate consequences. If I could move between the alternate reality where I chose A and the one where I chose B, and live in each for some time, my preference for A over B would become much more certain and less susceptible to suggestion and manipulation. The apparent messiness of the values comes from the usual messiness of real computations and the strategies for coping with our limited computational power, made worse by the habit of reporting deterministic choices instead of probability distributions.
Changeable is more than context-dependent: it’s that people can change their values themselves, in response to experience or social pressure.
Changing values in response to experience is also reasonable. E.g. I used to value food more than my weight, but then I got fat, experienced health problems, and decided that I should value weight more. This is not some human nonsense; this is what a perfect Bayesian would do.
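The Bayesian point can be sketched directly. A toy illustration, where the hypothesis, the prior, and both likelihoods are hypothetical numbers chosen for the example:

```python
# Hypothetical sketch: what looks like "changing values" after experience
# can be an ordinary Bayesian update on a factual belief.

def bayes_update(prior, likelihood_if_true, likelihood_if_false):
    """Posterior P(H | evidence) via Bayes' rule."""
    num = prior * likelihood_if_true
    return num / (num + (1 - prior) * likelihood_if_false)

# H = "my eating habits will cause serious health problems"
p = 0.2  # prior: I didn't think it likely, so I valued food highly

# Evidence: I got fat and experienced health problems. That evidence is
# much more likely if H is true than if it is false.
p = bayes_update(p, likelihood_if_true=0.9, likelihood_if_false=0.1)

# The posterior (~0.69) now dominates; the *decision* flips toward
# valuing weight more, with no change to the underlying utility at all.
```

On this reading the values never changed; only the beliefs feeding into the expected-utility calculation did.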
Social pressure surely falls under “underdefined and manipulable”, so I don’t have anything to add to my previous comment.