There’s nothing inherently wrong with simulating intelligent beings, so long as you don’t make them suffer. If you simulate an intelligent being and give it a life significantly worse than the one you could have given it, that’s ethically questionable. If you had the power to simulate someone, and you chose to simulate him in a world much like our own, with all the strife, trouble, and pain of this world, when you could just as easily have simulated him in a strictly better world, then I think it would be reasonable to say that you, the simulator, are morally responsible for all that additional suffering.
Agreed, but I’d like to point out that “just as easily” hides some subtlety in this claim.