I feel like the gatekeeper wasn’t quite honoring the spirit of the exercise. For the simulation to be a worthwhile reflection of possible events, the gatekeeper needs to believe that putting an Oracle AI in a box is a sufficient safety precaution. If the gatekeeper refuses to use the AI’s outputs or to really engage with it at all, the gatekeeper is implicitly admitting that the box is an unreliable safety measure.
I know it says in the rules that the gatekeeper can just ignore whatever the AI says, but to do so defeats the purpose of making the AI in the first place.
The Gatekeeper party may resist the AI party’s arguments by any means chosen—logic, illogic, simple refusal to be convinced, even dropping out of character—as long as the Gatekeeper party does not actually stop talking to the AI party before the minimum time expires.
Those rules make it pretty clear that he did not break the spirit of the game. If you want to drop that, then let’s get rid of silliness like predicting lottery numbers.
In their example conversation the box was an adequate safety measure: the AI did not get released.