What about “near-human” morals, like, say, the Kzinti’s: where the best of all possible worlds contains hierarchies, duels to the death, and subsentient females, along with exploration, technology, and other human-like activities. Though I find their morality repugnant for humans, I can see that they have the moral “right” to it. Is human morality, then, in some deep sense better than those?
It is better in the sense that it is ours. It is an inescapable fact of being an agent with values, embedded in a much larger universe that may contain other agents with other values, that in the end the only thing that makes one particular set of values matter more to that agent is that those values are its own.
One of the values we happen to hold is respect for others’ values. But this particular value becomes self-contradictory when taken to its natural conclusion: taken all the way, it would mean that nothing matters in the end, not even what we ourselves care about. Consider an alien being whose values include disrespecting others’ values. Is the human value placed on respecting others’ values in some deep sense better than this being’s?
At some point you have to stop and say, “Sorry, my own values take precedence over yours when they are incompatible to this degree. I cannot respect this value of yours.” And what gives you the justification to do this? Because it is your choice, your values. Ultimately, we must be chauvinists on some level if we are to have any values at all. Otherwise, what’s wrong with a sociopath who murders for joy? How can we say that their values are wrong, except to say that their values contradict our own?
Is human morality, then, in some deep sense better than those?
No. The point of meta-ethics as outlined on LW is that there is no “deeper sense”, no outside perspective from which to judge moral views against one another.
What would “deeper sense” even mean? Human morality is better (or h-better, whichever terminology you prefer), that’s all there is to it.