Note that human moral intuitions did not evolve to work well in transhumanist fantasies and dystopias commonly discussed here (that’s one reason why an AGI with human morality built-in would almost necessarily turn unfriendly). Thus before you can pronounce that “torture is bad”, you have to carefully define the terms “torture”, “is” and “bad” to make sense in your imagined setting. An earnest attempt to do that is likely to lead you deeper into metaethics, epistemology and AI research. Until then, your question is meaningless.