If the AI created enough simulations, it could potentially be more altruistic not to.
On the other hand, pressing “reset” or smashing the computer should stop the torture, which is necessarily more altruistic if humanity lives forever, but not if ems are otherwise unobtainable and humanity is doomed.
I was assuming a reasonable chance of humanity developing an FAI given the containment of this rogue AI. Even a small chance, multiplied by all the good that an FAI could do with the entire galaxy, let alone the universe, should outweigh the bad that can be done within Earth-bound computational processes.
I believe that a less convenient world that counters this point would take the problem out of the interesting context.