I’d like to second what Julian Morrison wrote. Take a human and start disassembling it atom by atom. Do you really expect to construct some meaningful binary predicate that flips from 1 to 0 somewhere along the route?
EY: What if an AI creates millions, billions, trillions of alternative hypotheses, models that are actually people, who die when they are disproven?
If your AI is fully deterministic, then any of its states can be recreated exactly. Just set the log level for the baby AI’s inputs to ‘everything’ and hope your supply of write-once-read-many media doesn’t run out before it gets smart enough to discard, in a provably Friendly way, the data that isn’t people. This doesn’t solve the problem of suffering, though.
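To make the replay idea concrete, here is a minimal sketch, assuming the AI’s transition function is a pure function of its current state and the next input; every name here (`step`, `run_and_log`, `recreate`) is a hypothetical illustration, not any real system’s API:

```python
import hashlib

def step(state: bytes, event: bytes) -> bytes:
    """Deterministic transition: the next state depends only on (state, event)."""
    # Stand-in for the real dynamics; any pure function works for the argument.
    return hashlib.sha256(state + event).digest()

def run_and_log(initial: bytes, events: list[bytes]) -> tuple[bytes, list[bytes]]:
    """Run forward while appending every input to a write-once log."""
    log: list[bytes] = []
    state = initial
    for e in events:
        log.append(e)            # loglevel = 'everything'
        state = step(state, e)
    return state, log

def recreate(initial: bytes, log: list[bytes]) -> bytes:
    """Recreate the final state exactly by replaying the logged inputs."""
    state = initial
    for e in log:
        state = step(state, e)
    return state

final, log = run_and_log(b"seed", [b"input-1", b"input-2", b"input-3"])
assert recreate(b"seed", log) == final  # bit-identical reconstruction
```

Under determinism, replaying the log is indistinguishable from never having discarded the state, which is exactly why it rescues the disproven models but not their suffering.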
Suppose an AI creates a sandbox and runs a simulated human with a life worth living inside it for 50 subjective years (interactions with other people are recorded at their natural boundaries, and we don’t consider merging minds). Then the AI destroys the sandbox, recreates it, and bit-perfectly reruns the simulation. Setting aside the pointless waste of computing resources, does your morality say this is better than, equivalent to, no different from, or worse than restoring a copy from backup?