If you buy into a fundamentalist interpretation of utility functions, then it's not. If you don't, then it is: to me, there should be some difference in something "meaningful" for there to be a difference in preferences; otherwise it's not a good utility function.
Even with the fundamentalist interpretation you get the known inconsistencies with probabilities, so it doesn't save you.
I think the strongest critique of D is that most people choose things they later honestly claim were not "what they actually wanted", i.e. D acts something like a stable utility function D_u with a time- and mood-dependent error term D_error added to it. It causes many people much suffering that their own actions don't live up to the standards of what they consider their true goals.
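A toy sketch of that D = D_u + D_error picture (the option names, noise model, and numbers here are my own illustrative assumptions, not anything from a standard model): a chooser with stable preferences D_u still ends up with a sizable fraction of choices that contradict those preferences once a mood-dependent noise term is added at decision time.

```python
import random

def choose(options, true_utility, mood_noise_sd, rng):
    """Pick the option maximizing D(x) = D_u(x) + D_error(x),
    where D_error is fresh mood-dependent Gaussian noise at choice time."""
    return max(options, key=lambda x: true_utility[x] + rng.gauss(0, mood_noise_sd))

rng = random.Random(0)
true_utility = {"gym": 2.0, "doomscroll": 1.0}  # D_u: stable preferences
options = list(true_utility)

# With noisy moods, some fraction of choices contradict the stable preference,
# which the agent can later honestly report as "not what I actually wanted".
choices = [choose(options, true_utility, mood_noise_sd=1.5, rng=rng)
           for _ in range(10_000)]
regret_rate = choices.count("doomscroll") / len(choices)
```

Under these assumed numbers roughly a third of choices go against the stable preference, even though D_u never changed; all the apparent inconsistency lives in D_error.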
Probabilistic inconsistencies in action are probably less of a problem for humans, though not completely absent.