What morality unit would it have other than humans’ volition?
If it has another, separate volition unit, yes.
If not, then only if humans fundamentally self-contradict, which seems unlikely, because biological systems are pretty robust to that.
I am not sure what a ‘morality unit’ is supposed to be, or how it would differ from a volition unit. Morality is either part of our volition, instrumental to it, or an imperative. In each case one could ask what we want and arrive at morality.
What I’m saying is: if Clippy tried to calculate our volition, he would conclude that our volition is immoral. (Probably. Maybe our volition IS paperclips.)
But if we programmed an AI to calculate our volition and use that as its own volition, our morality as its own morality, and so on, then it would not find our volition immoral unless we ourselves find our volition immoral, which seems unlikely.
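A toy sketch of the structural point, assuming nothing about any real CEV design (all names below are hypothetical stand-ins): an agent that judges human volition against its own separate standard can find it wanting, while an agent whose standard is *defined as* the extrapolation of human volition is judging that volition against itself, and can only disapprove if we self-contradict.

```python
# Toy sketch only: hypothetical names, not a real value-loading design.

def human_volition():
    """Stand-in for humanity's (extrapolated) values."""
    return frozenset({"flourishing", "fairness", "survival"})

# Clippy judges everything against its own, separate standard.
def clippy_finds_moral(values):
    return values == frozenset({"paperclips"})

# A CEV-style agent's standard IS the extrapolation of human volition,
# so evaluating our volition against it is evaluating it against itself.
def cev_finds_moral(values):
    return values == human_volition()

ours = human_volition()
print(clippy_finds_moral(ours))  # False: "immoral" by Clippy's lights
print(cev_finds_moral(ours))     # True: no separate standard to diverge from
```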
An AI that was smarter than us might deduce that we were not applying the Deep Structure of our morality properly because of bias or limited intelligence. It might conclude that human morality requires humans to greatly reduce their numbers in order to lessen the impact on other species, for instance.