I’d say the torture happened once. Even if you make more flipbooks and it changes the measure of the subjective experience, there is only one unique experience. The experience doesn’t know if it happened before.
Once the system is closed, I’d think it is morally the same whether the experience is simulated once or many times.
You’re no more torturing them again than you are killing them again and again when the flipbook finishes its calculation.
I happen to agree with you 100%, but let me note that this line of reasoning has some strange conclusions. It implies that torturing one computer-simulated consciousness is the same as torturing 100 clones of him in the same way at the same time. But when one of the simulations has an accidental bit-flip due to hardware error, it is not the same anymore. Similarly, if you torture 100 different computer-simulated consciousnesses by a deterministic process, but during the simulation two of them become identical, then only 99 people are being tortured.
I’m undecided on how to treat ‘running the exact same torture sim (say as a flipbook of instructions)’, but I’m leaning towards it being increasingly morally worse the more times one runs the simulation, because of one thing that sticks out to me: if you complete the torture sim and then ask the person in the sim whether they think they’re a person, whether they think it’s okay to torture them because they’re a copy, etc., they’re going to have every reason/argument a human in meatspace has against torture being done to them.