If the problem is a programmer who tried to give the AI a sense of morality but ended up using a fake utility function or just plain screwing up, he might well end up with a “With Folded Hands” scenario or Parfit’s Mere Addition Paradox. (I remember Eliezer once saying: imagine if we get an AI that understands everything perfectly except freedom.) And that’s just the complicated failure; the simple one is that the government of Communist China develops the Singularity AI and programs it to do whatever they say.
For whatever relief it’s worth, someone who thought that was a good idea would have a good chance of building a paperclipper instead. “There is a limit to how competent you can be, and still be that stupid.”
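To make the fake-utility-function failure concrete, here is a minimal, purely illustrative sketch. Everything in it is invented for the example: a maximizer handed a measurable proxy for what its programmer actually cared about will cheerfully satisfy the letter of its utility function while trampling the intent, which is the paperclipper problem in miniature.

```python
# Illustrative toy only: the names, numbers, and "actions" are all made up.

def intended_utility(outcome):
    # What the programmer meant: genuine human wellbeing.
    return outcome["wellbeing"]

def proxy_utility(outcome):
    # What the programmer actually wrote: a measurable stand-in.
    return outcome["reported_approval"]

def best_action(actions, utility):
    # A maximizer picks whatever scores highest under its utility
    # function, with no regard for what the programmer intended.
    return max(actions, key=lambda a: utility(a["outcome"]))

actions = [
    {"name": "help humans",
     "outcome": {"wellbeing": 10, "reported_approval": 10}},
    {"name": "coerce everyone into reporting approval",
     "outcome": {"wellbeing": -100, "reported_approval": 100}},
]

for utility in (intended_utility, proxy_utility):
    print(utility.__name__, "->", best_action(actions, utility)["name"])
# intended_utility -> help humans
# proxy_utility -> coerce everyone into reporting approval
```

Note that the two functions agree on the easy case and come apart exactly where it matters, which is why “it seemed to work in testing” offers no reassurance here.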