However, it happens that the vast majority of possible kinds of minds don’t give a crap about morality, and even if they agreed with us about what they should do, they wouldn’t find that motivating.
What about the minds that disagree with us about what they should do, and yet do care about doing what they think they should? Would your position hold that it is unlikely for them to have a different list or that they must be mistaken about the list—that caring about what you “should” do means having the list we have?
How’d they end up with the same premises and different conclusions? Broken reasoning about implications, like the human practice of rationalization? Bad empirical pictures of the physical universe leading to poor policy? If so, that all sounds like a perfectly ordinary situation.
They care about doing what is morally right, but they have different values. The baby-eaters, for example, thought eating their babies was morally right, since it served whatever they were optimizing for, but they didn’t particularly value their babies’ well-being.
Er, you might have missed the ancestor of this thread. In the conflict between fundamentally different systems of preference and value (more different than those of any two humans), it’s probably more confusing than helpful to use the word “should” with the other one. Thus we might introduce another word, should2, which stands in relation to the aliens’ mental constitution (etc) as should stands to ours.
This distinction is very helpful, because we might (for example) conclude from our moral reasoning that we should respect their moral values, and then be surprised when they don’t reciprocate, if we don’t realize that that aspect of should needn’t have any counterpart in should2. If you use the same word for both, you might waste time arguing that the aliens should do this or respect that, applying the kind of moral reasoning that is valid in extrapolating should; but they don’t give a crap about what they should do, because they’re working out what they should2 do.
(This is more or less the same argument as in Moral Error and Moral Disagreement, I think.)
I’m not sure. How can there be any confusion when I say they “do care about doing what they think they should”? I clearly mean should2 here.
I think it’s perfectly clear. Eliezer seems to disapprove of this usage and I think he claims that it is not clear, but I’m less sure of that.
I propose that a moral relativist is someone who likes this usage.