My general principle here is a generalization of the foundations of tort law: if you do an act that causes harm in a way that is reasonably foreseeable, you are responsible for that harm.
Under current tort law, a product modified by an end user wouldn’t usually make the manufacturer liable.
Refrain from initiating the commercial, public, or widespread use of a covered model if there remains an unreasonable risk that an individual may be able to use the hazardous capabilities of the model, or a derivative model based on it
Is it your view that there is a substantial list of capabilities it should be legal to freely distribute an AI model with, but which would rightly be illegal to hire a person to do?
I don’t know. The “business as usual” script would be to say there should be few limits. It is legal to freely distribute a CNC machine, a printer, or a laser cutter. All of these machines will do whatever the user instructs, legal or not, and components like door safety switches are commonly simple and straightforward to bypass; the manufacturer won’t be responsible if the user deliberately bypasses a safety mechanism. There are some limits: printers, scanners, and image manipulation software will check for US currency. But open software that can easily be modified to remove those limits is available. https://www.reddit.com/r/GIMP/comments/3c7i55/does_gimp_have_this_security_feature/
I think it’s an empirical question whether it is reasonably foreseeable that any particular AI model will cause harm.
The reason they say the rules are written in blood is that you must wait for a harm to happen first and then pass laws afterward; otherwise you will be at a competitive disadvantage, which is what this law may cause.