You’ve encrypted a brain, and maybe salted it a bit to boot. You’re still running the brain’s “consciousness” program, it’s just encrypted, and the brain is still experiencing exactly the same things, because it is running exactly the same program it would otherwise. The fact that the brain is cryptographically entangled with other data doesn’t make the brain not exist.
I think this is probably the correct answer. If a simulation obtains the correct result, it is simulating the mind in some form, even while shuffling computations between obfuscatory boxes. The notion that adding physical states somehow changes this is a red herring. If I translate you 10 m to the left, you don’t stop being a mind.
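To make the “same program, just obfuscated” point concrete, here is a minimal toy sketch (not real homomorphic encryption; the state space, update rule, and variable names are all invented for illustration). A deterministic “mind” update rule is run once in the clear and once under a secret bijective encoding of its state space. The encoded run only ever handles encoded states, yet decoding its final state recovers exactly what the plaintext run computed, which is the sense in which wrapping the computation in an encryption layer doesn’t change what is being computed.

```python
import random

STATE_SPACE = 256  # toy state space: a single byte


def step(s: int) -> int:
    """Plaintext update rule for the toy 'mind' (an arbitrary deterministic map)."""
    return (s * 167 + 13) % STATE_SPACE


# Secret "encryption": a random bijection on the state space, plus its inverse.
rng = random.Random(0)
enc_table = list(range(STATE_SPACE))
rng.shuffle(enc_table)
dec_table = [0] * STATE_SPACE
for plain, cipher in enumerate(enc_table):
    dec_table[cipher] = plain

# The conjugated update rule enc . step . dec, precomputed as a lookup table by
# whoever holds the secret. The party running it never touches a plaintext state.
enc_step_table = [enc_table[step(dec_table[c])] for c in range(STATE_SPACE)]


def step_encrypted(c: int) -> int:
    """Update rule that acts only on encoded ('encrypted') states."""
    return enc_step_table[c]


# Run the same program both ways.
s = 42                 # plaintext run
c = enc_table[s]       # encoded run, starting from the encoding of the same state
for _ in range(1000):
    s = step(s)
    c = step_encrypted(c)

# The encoded run computed exactly the same trajectory, just under the encoding.
assert dec_table[c] == s
print("plaintext final state:", s, "== decoded encoded state:", dec_table[c])
```

A real scheme would of course use cryptographically strong encryption rather than a lookup-table bijection, but the structural point is the same: the host shuffles opaque boxes, and the program’s behaviour is unchanged.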
If a simulation obtains the correct result, it is simulating the mind in some form
However, this conclusion doesn’t have implications for morality: if I (as a human) think about how you’d feel if I tortured you, I get the correct answer. That doesn’t make my thinking a condemnable act.
Except you don’t get the correct answer when you think about it. Your simulation is incredibly abstracted, and your model of the victim does not have much, if any, moral weight. If you had the ability to calculate a precise mindstate, that’s the level of simulation that would require moral calculus.