Moral questions are terminal. Ethical questions are instrumental.
I would argue that ethics are values that are instrumental, but treated as if they were terminal for almost all real object-level decisions. Ethics are a human cognitive shortcut: we need them because we can't actually compute the expected cost of a black swan bet. An AI without our limitations might not need ethics. It might be able to keep all its instrumental values in its head as instrumental, without getting confused the way we would.