I can easily imagine that if I ran a simulation of mankind’s evolutionary history, I’d adopt a principle of responding to simulants’ requests, provided the requests were small enough not to interfere with the goals of the simulation, just in case the simulants have some awareness. Since the purpose of the simulation isn’t simply to satisfy all the simulants’ needs for them (doing so would in fact be orthogonal to its actual purpose), they would have to make some kind of request before I did anything on their behalf.