In your case, some force is needed to actually push most organisations to participate in such a project, and the worst ones, those that want to build AI first in order to take over the world, will not participate at all. The IAEA is an example of such an organisation, but it was not able to stop North Korea from creating its nukes.

Because of this, you need a powerful enforcement agency above your AI agency. It could use conventional weapons, mostly nukes, or some form of narrow AI to predict where strong AI is being created, or both. Basically, this means creating a world government designed specifically to contain AI.

This is improbable in the current world, as nobody will create a world government mandated to nuke AI labs based only on reading Bostrom's and EY's books. The only chance for its creation is if some very spectacular AI accident happens, like hacking 1000 airplanes and crashing them into 1000 nuclear plants using narrow AI with some machine learning capabilities. In that case, a global ban on AI seems possible.