This is cogent and forceful, but I still think it's wrong. There's something to morality beyond the presence of a planning algorithm. I can't currently imagine what that might be, though, so maybe you're right that the difference is one of degree and not of kind.
I think part of the confusion is that Eliezer is distinguishing morality as a particular aspect of human decision-making, while many of the comments seem to want to include any decision-making criteria as a kind of generalized morality.
Morality may just be a deceptively simple word that covers extremely complex aspects of how humans choose, and justify, their actions.