I think this is probably the correct answer. If a simulation obtains the correct result, it is simulating the mind in some form, even while shuffling computations between obfuscatory boxes. The notion that adding physical states somehow changes this is a red herring. If I translate you 10 m to the left, you don’t stop being a mind.
However, this conclusion doesn’t have implications for morality: if I (as a human) think about how you’d feel if I tortured you, I get the correct answer. That doesn’t make my thinking a condemnable act.
Except you don’t get the correct answer when you think about it. Your simulation is incredibly abstracted, and your model of the victim does not have much, if any, moral weight. If you had the ability to calculate a precise mindstate, that would be the level of simulation requiring moral calculus.