I thought that when humans and Clippy speak about morality, they speak about the same thing (assuming that they are not lying and not making mistakes).
The difference is in connotations. For humans, morality has a connotation “the thing that should be done”. For Clippy, morality has a connotation “this weird stuff humans care about”.
So, you could explain the concept of morality to Clippy, and then also explain that X is obviously moral. And Clippy would agree with you. It just wouldn’t make Clippy any more likely to do X; the “should” emotion would not get across. The only result would be Clippy remembering that humans feel a desire to do X; and that information could be later used to create more paperclips.
Clippy’s equivalent of “should” is connected to maximizing the number of paperclips. The fact that X is moral is about as important to it as the existence of a specific paperclip is to us. “Sure, X is moral. I see. I have no use for this fact. Now stop bothering me, because I want to make another paperclip.”
Oh, yes. I was using “moral” the same way you used “should” here.
So why do humans have different words for “would do it” and “should do it”?