If that were the case, it would not be a utility monster. It would be a bunch of people piloting a giant robot that is capable of birthing more people. A utility monster is supposed to be one distinct individual.
Your ethical theory is in deep trouble if it depends on a notion of ‘distinct individual’ in any crucial way. It is easy to imagine scenarios where there is a continuous path from robot-piloting people to one giant hive mind. (Kaj wrote a whole paper about such stuff: Coalescing minds: Brain uploading-related group mind scenarios) Or we can split brain hemispheres and give both of them their own robotic bodies.
I imagine it is possible to develop some ethical theory that could handle creatures capable of merging and splitting. One possibility might be to count “utility functions” instead of individuals. This would, of course, result in weird questions, like whether two people’s preferences stop counting when they merge and then count again when they split. But at least it would stop someone from giving themselves a moral right to everything by making enough ems of themself.
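To make the counting proposal concrete, here is a toy sketch in Python (the `uf_id` field and the averaging rule are my own hypothetical choices, not anything from Kaj’s paper): welfare is aggregated per distinct utility function, so em copies that share a utility function count once no matter how many of them exist. Deciding what `uf_id` a merged or split mind gets is, of course, exactly the hard part being waved at here.

```python
# Toy sketch of the "count utility functions, not individuals" idea.
# Everything here (the uf_id field, the averaging rule) is a hypothetical
# illustration, not an established formalism.

def aggregate_welfare(agents: list[dict]) -> float:
    """Sum welfare over distinct utility functions rather than over agents.

    Each agent dict has:
      'uf_id'   - identifier of its utility function (copies share one id)
      'welfare' - how well-satisfied that utility function currently is
    Bearers of the same utility function are averaged, so duplicating
    yourself into many ems doesn't multiply your moral weight.
    """
    by_uf: dict[str, list[float]] = {}
    for agent in agents:
        by_uf.setdefault(agent["uf_id"], []).append(agent["welfare"])
    # One "vote" per utility function: average across its bearers, then sum.
    return sum(sum(ws) / len(ws) for ws in by_uf.values())

# Alice makes two em copies of herself; Bob stays singular.
population = [
    {"uf_id": "alice", "welfare": 10.0},
    {"uf_id": "alice", "welfare": 10.0},  # em copy
    {"uf_id": "alice", "welfare": 10.0},  # em copy
    {"uf_id": "bob",   "welfare": 4.0},
]
print(aggregate_welfare(population))  # 14.0, not 34.0: copies count once
```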
Again, this idea probably has problems that need to be worked out. I very much doubt that I could figure out all the ethical implications in one response when Kaj wasn’t able to in a huge paper. But I don’t think it’s an insurmountable problem.