Eh, if people want to copy human typical (I’ll call it “folk”) morality, that probably won’t end too badly, and it seems like good practice for modeling other complicated human thought patterns.
Whether it’s the right morality to get machines to use is a pretty meta-ethical question. But if the audience is moved by consistency, you might use a trolley-problem analogy: building a computer is itself analogous to throwing a switch, so folk morality says you should be more consequentialist about it. On that view, making a computer that handles the trolley problem using folk morality is wrong if folk morality is right, and also wrong if folk morality is wrong.