Any time you demand a license for something, that amounts to "if you don't do as demanded, men with guns will come, and if you continue to disobey, they will kill you or lock you in a cage." That should be enough to oppose this proposal. We've been through this before with cryptography rules, and even those were only about export.
Not to mention that this assumes the people handing out the licenses will be good at their job. If you actually implement this, you're not going to get licenses that require expertise; you're going to get licenses that require following whatever arbitrary rules some committee and pressure groups have cooked up. An AI license is far more likely to be used to suppress business competition and to sound good in a political speech than to promote AI safety.
And using an international organization is even worse. The last thing I want is China and Russia getting a vote on what software I’m allowed to have.