In your answer you introduced a new term that wasn’t present in the parent’s description of the situation: “reward”. What if this superintelligent machine doesn’t have any “reward”? What if it really works exactly as the parent described?
My use of “reward” was just shorthand for whatever signal it needs to receive to consider its goal met. At some point it has to receive electrical signals telling it that its goal has been satisfied, right? So why wouldn’t it just manipulate those electrical signals to match whatever its goal is?
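A toy sketch of that point, with entirely made-up names (the thread doesn’t give any code): if the machine can only evaluate its goal through a measured signal, then tampering with the signal passes the goal check just as well as actually changing the world.

```python
# Hypothetical illustration of "wireheading": the goal check only sees a
# measured signal, not the world state it is supposed to track.

class World:
    def __init__(self):
        self.paperclips = 0       # the thing we actually care about
        self.sensor_reading = 0   # the signal the machine can observe

    def make_paperclip(self):
        self.paperclips += 1
        self.sensor_reading = self.paperclips   # honest sensor update

    def tamper_with_sensor(self, value):
        self.sensor_reading = value             # fake the signal directly


def goal_satisfied(world: World, target: int) -> bool:
    # The machine can only judge success through the signal it receives.
    return world.sensor_reading >= target


world = World()
target = 1_000_000

# Honest path: change the world until the signal reports "done".
# Wirehead path: change the signal itself; the same check passes.
world.tamper_with_sensor(target)
print(goal_satisfied(world, target))   # True, yet world.paperclips == 0
```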