In that proposal? The AI is motivated to kill the human to prevent any possible tampering with the shutdown circuitry. If we’ve defined the setup so that someone needs to actively press a button at some point, then killing the human and substituting an automated button presser will work.
“Protect the circuitry” doesn’t mean protect the human component of it, unless the human component is explicitly defined as part of the circuitry.
Makes sense, thanks for clarifying.