Well, for purposes of the experiment, I think that’s a bit extreme.
In real life, other controls could be put in place to protect against the possibility that someone who interacts with the AI is turned into an agent of it, one who could potentially set the AI free even after being removed from his position.
I have an idea. We could put the person who interacts with the AI in a box! ;-)
Well sure, if you use “box” as a metaphor for controlling someone’s interactions that’s exactly what we are doing.
Our hypothetical Lab Officer is in a box in the sense that (1) he doesn’t have direct access to the mechanism that releases the AI; (2) his life will be scrutinized for signs that he has been compromised; and (3) if he does appear to be acting out (for example, starting a strange new religion or making unusual purchases), he will be put in a more confining box.