So how does one determine whether a system of motivations is a system of morality or not?
Well, I think one of the minimal elements required to identify something as morality is that one tends to prefer other people to be moral, at least in general, and at least when their immorality doesn’t benefit you directly.
Babyeaters wanted other people to eat babies, and superhappies wanted other people to... superhappy, but Kifs don’t seem to have any reason to encourage sfik-seeking in others.
Kif find sfik-seeking entities easier to predict, and therefore easier to control. Thus they prefer sfik-seeking behaviour in others for purely practical reasons.
That’s an interesting distinction. So a paperclip maximizer would seem to be an entity with a moral system.
I’d certainly be more willing to assign that label to Clippy’s system than to the Kifs’. Though perhaps Clippy is too well-adjusted: if its preferences are identical to its morality, rather than merely being influenced by it, that may still be slightly too different from the way morality motivates humans for it to be given that label.
I’d feel better calling Clippy’s clippyness “morality” if it occasionally made staples and then felt bad about it, or at least had a personal preference for yellow paperclips in its vicinity while allowing that paperclips of other colors are just as clippy.