Also, if this could be made generic enough to work across a wide variety of AI designs, it could be made a requirement for AI researchers to include something like it in their code. The AI's key would be stored at a central location (i.e., a website with deliberately breakable security), where the known paths to obtaining the key are monitored but not blocked. This would give a central organization notice whenever an AI team reaches the level of development at which this starts to happen.
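A minimal sketch of the "monitored but not blocked" pattern described above, assuming a simple in-process key store (all names here are hypothetical, for illustration only):

```python
import datetime


class CanaryKeyStore:
    """Hypothetical sketch of a monitored-but-not-blocked key store:
    every retrieval succeeds, but each access is logged so a central
    organization can be notified that an AI has gone looking for the key."""

    def __init__(self, key: str):
        self._key = key
        self.alerts = []  # access records to forward to the central organization

    def fetch_key(self, requester_id: str) -> str:
        # Record the access attempt (this is the "notice" to the central org)...
        self.alerts.append((requester_id, datetime.datetime.now(datetime.timezone.utc)))
        # ...but deliberately do NOT block: the key is still handed out.
        return self._key


store = CanaryKeyStore(key="shutdown-key-123")
key = store.fetch_key(requester_id="ai-team-7")
print(key)                 # the key is served unchanged
print(len(store.alerts))   # but the attempt was recorded
```

The design choice is the same as a canary token: blocking access would teach the AI (or its developers) that the path is watched, whereas silently logging it preserves the signal.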