Yes, if the coalition is a large fraction of the world then I am saying there is an asymmetry in that the leading project can more easily spy on that large fraction of the world than the other way round. This is because large fractions of the world contain many different people and groups, some of which will leak secrets (or sell secrets) to the leading project, unless extremely and unprecedentedly effective anti-leaking-and-spying measures are implemented across a large fraction of the world. It’s hard but doable for one corporation to keep trade secrets from the rest of the economy; how on earth can the rest of the economy keep trade secrets from a powerful corporation?
I don’t see how I’m arguing that the proportional importance of spying will go down. The proportional importance of spying will go up precisely because it won’t be accelerated as much as AI technology in general will be. (Why don’t I think spying will be accelerated as much as AI technology in general? I certainly agree that spying technology will be accelerated as much as or more than AI technology. However, I think spying success is a function of several things, only one of which is spying technology; the others are non-technology things like having literal human spies climb through the ranks of enemy orgs, as well as anti-spying technology.) I envision a future where spying is more rewarding than at any time in history, and yet the actual amount of successful spying is less than 10x-100x greater than in the past, due to the factors mentioned in the parenthetical.
“The leading project can choose not to sell its technology, but then it just has less money and so falls further and further behind in terms of compute etc. (and at any rate, it needs to be selling something to the other people in order to even be able to afford to use their technology).”
Again, my whole point is that this is only true in the long run. Yes, in the long run a project which relies on other sources of income to buy the things it needs will lose ground to projects which sell their innovations. But in the short run, empirically, projects seem able to go for years on funding raised from investors and wealthy parent companies. I guess your point is that in a world where the economy is growing super fast due to AI, this won’t hold: any parent company or group of investors capable of funding the leading project in year X will be relative paupers by year X+3 unless their project has been selling its tech. Am I right in understanding you here?
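(To make the arithmetic behind “relative paupers” concrete, with purely illustrative numbers rather than anything from the discussion: if the wealth of tech-selling actors grows at rate g per year while the leading project’s funders grow their wealth at a slower rate r, then after t years the funders’ relative purchasing power has shrunk by a factor of ((1+r)/(1+g))^t. With, say, g = 100% and r = 10%, that factor is (1.1/2)^3 ≈ 0.17 after three years, i.e. the funders can afford roughly a sixth as much frontier compute, relative to everyone else, as they could in year X.)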
(Miscellaneous: I don’t think the leading project differs from all the other projects that develop tech and keep it private. As Wei Dai said, insofar as a company can charge people to use tech without letting the secrets of how to build that tech escape, it will obviously do so. I think our disagreement is about your last two sentences, which I quoted above.)