Isn’t this akin to a protocol for securely monitoring private industry’s experiments in thermonuclear weapons? It’s better than nothing, but when something is dangerous enough, industrial regulation is never strict enough.
Some things are too dangerous to allow private competition in. The only sensible thing to do is nationalize them and have them run exclusively by extremely security-minded government agencies, if at all. And even that might not be good enough for AI, because we've never had a technology whose base-case scenario was "kill everyone".