(I haven’t yet read the things this is responding to.)
Doing AI research as it’s currently done in secret seems difficult, especially if it’s “secret because we said we’d stop but we lied” rather than “secret because we’re temporarily in stealth mode”.
You need to employ people who a) can do the job and b) you trust not to leak that you’re employing them to do a thing you’ve publicly said you wouldn’t do. You can’t easily ask if you can trust them because if the answer is no they already know too much.
Notably, I don’t think this is a case of “you can just have a small handful of people doing this while no one else at the company has any idea”. Though admittedly I don’t know the size of current AI research teams.
The plausible counterexample that comes to mind is the VW emissions scandal; I don’t know (and can’t immediately see on Wikipedia) how many people were involved in that.