It very likely isn’t. The purpose of the experiment is to show that people can end up releasing the pretend-AIs even when they go in assuming that keeping an AI in a box is a sufficient safeguard against potential unfriendliness. In other words, you might not want to plan on building potentially unfriendly AIs and relying on the box as your safeguard.