Perhaps then we should speak of what we want in terms of “terminal values”? For example, I might say that it is a terminal value of mine that I should not murder, or that freedom from authority is good.
But what does “terminal value” mean? Usually, it means that the value of something is not contingent on or derived from other facts or situations. For example, I may value beautiful things in a way that is not derived from what they get me. The recursive chain of valuableness terminates at some set of values.
… if even the most fundamental-seeming moral feelings are subject to argument, I wonder if there is any coherent sense in which I could be said to have terminal values at all.
TimS mentioned moral anti-realism as one possibility. I have a favorable opinion of desire utilitarianism (search for pros and cons), which is a system that would be compatible with another possibility: real and objective values, but not necessarily any terminal values.
By analogy, such a situation would describe moral values the way epistemological coherentism (versus foundationalism) describes knowledge. The mental model would be a web rather than a hierarchy. At least it’s a possibility—I don’t intend to argue for or against it right now, as I have minimal evidence.