Trust between partners does not happen overnight. You don't suddenly begin sharing information with competitors when the prize is in sight. We need a history of shared information to build upon, and now—when, as you said, AGI is not really close—is the right time to build it. Because if you don't trust someone with GPT-3, you are certainly not going to trust them with an AGI.
Choosing not to release GPT-3's weights to the whole world doesn't imply that you don't trust DeepMind or Anthropic or whoever. It just implies that there exists at least one person in the world you don't trust.
I agree that releasing everything publicly would make it easier/more likely to release crucial things to key competitors when the time comes. Alas, the harms are big enough to outweigh this benefit, I think.