For example, suppose a computer program needs to model people very accurately in order to make some predictions, and it models them so accurately that the "simulated" people can experience conscious suffering. In a very large computation of this type, millions of such people could be created, suffer for some time, and then be destroyed once they are no longer needed for the predictions the program was built to make. This idea was first raised by Eliezer Yudkowsky in Nonperson Predicates.
Nitpick: we can date this concern at least as far back as Vernor Vinge’s A Fire Upon the Deep:
Pham Nuwen’s ticket to the Transcend was based on a Power’s sudden interest in the Straumli perversion. This innocent’s ego might end up smeared across a million death cubes, running a million million simulations of human nature.