PhilGoetz seems to be using “parameter” to mean “instrumental value” and “value” to mean “terminal value”.
If “parameter” were “instrumental value”, that would mean you could set the parameter according to your values. That would be very similar to saying it was part of your values.
The notion of “value” vs. “instrumental value” is probably bankrupt. It’s very similar to the dichotomy of “grounded symbol vs. dependent symbol”. I talked about this in the section “Value is a network problem” of my regrettably long post on values.
In this particular case, I’m also presenting the option that “parameter” is something you can place logical constraints on, regardless of values. Like math: You can’t say that believing that 1+1=3 is one of your values.
Answer to Wei Dai: It’s a parameter you need to set to implement your values, but I don’t know how you set it. That’s the problem I’m pointing out. Resolving whether it is an instrumental value or not would solve the problem.
If I understand you correctly, you’re thinking of utility functions as f(p,U), where U is the state of the universe and p is some parameter. Here f is your values and p is the parameter. If that’s the case, I don’t understand the distinction you’re making by separating out p, i.e., why not just work with F(U) = f(p,U)? If there’s uncertainty about the value of p, how is that different from uncertainty about your utility function?
That is the question posed by the original post. There is a long history of assuming that it makes sense to talk about using the same utility function with different parameters.
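For concreteness, here is a minimal sketch of the f(p,U) framing in the exchange above; the payoff, the “feature” key, and the candidate values of p are invented for illustration and are not from the original post:

```python
from functools import partial

# A toy, made-up utility in the f(p, U) form described above:
# p is the parameter, and the universe-state U is a dict with one feature.
def f(p, U):
    return p * U["feature"]

# Folding a fixed p in gives F(U) = f(p, U), an ordinary utility function.
p = 0.5
F = partial(f, p)

U = {"feature": 2.0}
assert f(p, U) == F(U)  # separating out p changes nothing once p is fixed

# If p is uncertain, you can average over candidate values of p just as you
# would average over uncertain utility functions, which is the equivalence
# Wei Dai is pointing at.
p_beliefs = {0.25: 0.5, 0.75: 0.5}  # hypothetical probabilities over p
expected_utility = sum(prob * f(p_i, U) for p_i, prob in p_beliefs.items())
print(expected_utility)  # 1.0 in this toy example
```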
Well, instrumental value is a totally standard concept:
http://en.wikipedia.org/wiki/Instrumental_value
It is different from the equally-standard concept of intrinsic value:
http://en.wikipedia.org/wiki/Intrinsic_value_%28ethics%29
...so: it is not very clear what you mean.
I think what you are saying is that because instrumental values arise out of intrinsic values we need not bother too much in distinguishing between them.
However, that isn’t right—there’s at least one important time to know what is an instrumental value and what is an intrinsic value—namely when self-improving.
No, I really think instrumental vs. intrinsic values is a bankrupt set of ideas, despite being standard. Our values were not generated by starting with a set of intrinsic values, then adding those instrumental values needed to achieve them. Similarly, I don’t think that we learn language by learning a set of fundamental words, and then defining all other words in terms of them. The link to my regrettably long post on values above is supposed to elucidate this (though I’m not sure it’s any clearer than what I just said).
Well, then please reconsider. Humans intrinsically value warmth, sweetness, fullness, orgasms, the absence of pain—and various other things. They instrumentally value money, qualifications, property rights, and so on. Mostly the instrumental values arise out of the intrinsic values—in the context of some environment.
There may be some wrinkles to this kind of model. There may be instinctive predispositions towards some instrumental values. There may be instrumental values that only develop as a result of certain types of interaction with the environment. However, overall, I fail to see how arguing against the significance of the instrumental / intrinsic value split is productive.
I am pretty sure that any more sophisticated model would still exhibit the same instrumental / intrinsic value division.
I don’t think you have the same notion of instrumental values in mind that everyone else has. Instrumental values aren’t values, properly speaking. If X is an instrumental value, all this means is that X is useful for obtaining our values. Examples of instrumental values: money, natural resources, or computing power.
The problem with distinguishing instrumental and terminal values in humans is that it isn’t always clear whether you really do value, for example, justice, or whether it’s just a heuristic for obtaining that which you do value.