Peter Thiel: So you’re both thinking it will all fundamentally work out.
Scott Brown: Yes, but not in a wishful thinking way. We need to treat our work with the reverence you’d give to building bombs or super-viruses. At the same time, I don’t think hard takeoff scenarios like Skynet are likely. We’ll start with big gains in a few areas, society will adjust, and the process will repeat.
I don’t believe anyone working on AI is actually treating it that way. I do hope, however, that whenever there are signs of a possible breakthrough, researchers will stop, assess what they have very carefully, and build a lot of safety features before doing any more development. Most important of all, I hope that whoever makes the key discoveries does not publish their results in a way that would enable more reckless groups to copy them.
I didn’t say that researchers should publish binaries without source code; I said they should hold off on publishing at all. This isn’t about open vs. closed source.
Open source is about publishing the code (and allowing it to be reused). You’re talking about not publishing the code. Plenty of software companies don’t publish binaries (e.g. Google, Facebook). Binaries or no, it’s not open source if you don’t even publish the code.
Nevertheless, anyone who has the binary stands a chance of reverse engineering it. If you broadcast such a binary, a leak is guaranteed. If you don’t publish at all, you at least stand a chance at actual secrecy. (Pretty unlikely, though, if too many people are involved.)
In software development, secrecy is often undesirable: nobody trusts you, nobody will work with you, and nobody can help you, so you are pretty screwed. Hence all the OSS in the world’s infrastructure these days.
I expect the military and megacorps will be the biggest advocates of closed source machine intelligence software.
That’s what happened when the government tried to keep cryptography out of citizens’ hands, anyway.
Such efforts are ultimately futile, but they do sometimes slow progress down, thereby helping those with early access attain their own goals.