No, unless you mean by taking invasive action like scanning my brain and applying whole-brain emulation. It would then quickly learn that I'd consider the action it took an unforgivable violation of my individual sovereignty.
Shrug. Then let's take a bunch of people less fussy than you: could a suitably equipped AI emulate their morality better than they can?
Morality isn’t some universal truth written on a stone tablet:
That isn't a fact.
it is individual and unique like a snowflake.
That isn't a fact either, and it doesn't follow from the above, since moral nihilism could be true.
If my moral snowflake says I can kick you in the shin, and yours says I can't, do I get to kick you in the shin?