I’d certainly be more willing to assign that label to Clippy’s system than to Kifs’. Though perhaps Clippy is too well-adjusted: if its preferences are identical to its morality, rather than its morality merely influencing its preferences, that may still be slightly too different from the way morality motivates humans for it to be given that label.
I’d feel better calling Clippy’s clippyness “morality” if it occasionally made staples and then felt bad about it, or if it at least had a personal preference for yellow paperclips in its vicinity while allowing that paperclips of other colors are just as clippy.
That’s an interesting distinction. So a paperclip maximizer would seem to be an entity with a moral system.