Well said, but do we then conclude that we actually value justice in itself, or that we value it only instrumentally? Yes, evolution designed us to care about justice for subjunctive deterrence reasons, but so what? Evolution designed all of our values instrumentally, for all sorts of purposes that we may or may not care about, yet that doesn't mean we have no values. I have no idea how to answer this, and more generally I'm at a loss for how to determine whether a perceived value is terminal or instrumental.
My article, “Morality as Parfitian-filtered decision theory?”, was devoted to exactly that question, and my conclusion is that justice—or at least the feeling that drives us to pursuit-of-justice actions—is an instrumental value, even though such actions cause harm to our terminal values. This is because theories that attempt to explain such “self-sacrificial” actions by positing justice (or morality, etc.) as a separate term in the agent’s utility function add complexity without corresponding explanatory power.
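Roughly, and putting it in generic Occam/MDL notation rather than anything from the article itself, the comparison I have in mind scores a hypothesis H about the agent by

```latex
\mathrm{Score}(H) \;=\; \underbrace{K(H)}_{\text{complexity penalty}} \;+\; \underbrace{\bigl(-\log P(\text{behavior} \mid H)\bigr)}_{\text{fit to observed behavior}}
```

If the purely decision-theoretic hypothesis and the "decision theory plus a separate justice term" hypothesis predict the same pursuit-of-justice behavior, their fit terms are equal, so the extra term is pure complexity cost.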
I skimmed the article. First, good idea; I would never have thought of that. But I do think there is a flaw: given evolution, we would expect humans to have fairly complex utility functions rather than simple ones. The complexity penalty for [evolution + a simple utility function] could actually be higher than the penalty for [evolution + a complicated utility function], depending on precisely how complex each is. For example, I assert that the complexity penalty for [evolution + a utility function with only one value (e.g. paperclips or happiness)] is higher than the complexity penalty for [evolution + any reasonable approximation of our current values].
This is only to say that a more complicated utility function for an evolved agent doesn’t necessarily imply a high complexity penalty. You could still be right in this particular case, but I’m not sure without actually being able to evaluate the relevant complexity penalties.
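A sketch of what I mean, using K(·) loosely for whatever complexity measure the penalty is based on (my notation, not the article's): the cost of the joint hypothesis decomposes roughly as

```latex
K(\text{evolution},\, U) \;\approx\; K(\text{evolution}) \;+\; K(U \mid \text{evolution})
```

so the comparison that matters is between the conditional terms K(U | evolution). Conditional on evolution having produced us, a messy bundle of values close to our actual ones may be cheaper to specify than a single-value utility function like "paperclips" or "happiness", even though the single-value function is simpler unconditionally.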
That’s a good point, and I’ll have to think about it.