How’d they end up with the same premises and different conclusions?
They care about doing what is morally right, but they have different values. The baby-eaters, for example, thought it was morally right to optimize whatever it was they were optimizing by eating the babies, but didn’t particularly value their babies’ well-being.
Er, you might have missed the ancestor of this thread. In a conflict between fundamentally different systems of preference and value (more different than those of any two humans), it’s probably more confusing than helpful to apply the word “should” to the other side. Thus we might introduce another word, should2, which stands in the same relation to the aliens’ mental constitution (etc.) as should stands to ours.
This distinction is very helpful: we might, for example, conclude from our moral reasoning that we should respect their moral values, and then be surprised that they don’t reciprocate, because we failed to realize that that aspect of should needn’t have any counterpart in should2. If you use the same word for both, you might waste time arguing that the aliens should do this or respect that, applying the kind of moral reasoning that is valid in extrapolating should; but they don’t give a crap about what they should do, they’re working out what they should2 do.
(This is more or less the same argument as in Moral Error and Moral Disagreement, I think.)
I’m not sure. How can there be any confusion when I say they “do care about doing what they think they should”? I clearly mean should2 here.
I think it’s perfectly clear. Eliezer seems to disapprove of this usage and I think he claims that it is not clear, but I’m less sure of that.
I propose that a moral relativist is someone who likes this usage.