Imagine a contract. “We the undersigned agree that AGI is a powerful, useful but also potentially dangerous technology. To help avoid needlessly taking the same risk twice we agree that upon development of the worlds first AI, we will stop all attempts to create our own AI on the request of the first AI or its creators. In return the creator of the first AI will be nice with it.”
Then you aren’t stopping all competitors. Your stopping the few people that can’t cooperate.
Imagine a contract. “We the undersigned agree that AGI is a powerful, useful but also potentially dangerous technology. To help avoid needlessly taking the same risk twice we agree that upon development of the worlds first AI, we will stop all attempts to create our own AI on the request of the first AI or its creators. In return the creator of the first AI will be nice with it.”
Then you aren’t stopping all competitors. Your stopping the few people that can’t cooperate.