The time it takes to get a DSA by growing bigger depends on how big you are to begin with. If I understand correctly, you take your 30 years from considering the largest countries, which are not far from being the size of the world, and then use it when talking about AI projects that are much smaller (e.g. a billion dollars a year suggests about 1⁄100,000 of the world). If you start from a situation of an AI project being, say, three doublings from taking over the world, then most of the question of how it came to have a DSA seems to be the question of how it grew the other seventeen doublings. (Perhaps you are thinking of an initially large country growing fast via AI? Do we then have to imagine that all of the country’s resources are going into AI?)
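As a rough back-of-the-envelope sketch of the size arithmetic above (the ~$100 trillion figure for gross world product, and treating annual spending as the relevant size measure, are assumptions for illustration only):

```python
import math

# Assumed figures, for illustration only.
world_output = 100e12    # gross world product, roughly $100 trillion per year (assumption)
project_budget = 1e9     # an AI project spending about $1 billion per year

fraction = project_budget / world_output                     # ~1/100,000 of the world
total_doublings = math.log2(world_output / project_budget)   # ~17 doublings to world scale

print(f"fraction of world output: {fraction:.1e}")                    # 1.0e-05
print(f"doublings from here to world scale: {total_doublings:.1f}")   # ~16.6

# Being "three doublings from taking over the world" means already holding
# roughly 1/8 of it; under these assumptions the bulk of the doublings, and
# hence of the explanatory work, lies in the growth before that point.
```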
I was thinking of an initially large country growing fast via AI, yes. Still counts; it is soft takeoff leading to DSA. However, I am also making much stronger claims than that: I think it could happen with a corporation or rogue AGI.
I don’t think annual income is at all a good measure of how close an entity is to taking over the world. When Cortez landed in Mexico, he had less than 1⁄100,000th of the income, population, etc. of the region, yet he ruled the whole place three years later. Then, a few years after that, Pizarro repeated the feat in Peru, which is good evidence that it wasn’t just an amazing streak of luck.
1) Even if it counts as a DSA, I claim that it is not very interesting in the context of AI. DSAs for entities already almost as large as the world are commonplace. For instance, in the extreme, the world minus any particular person could take over the world if it wanted to. The concern with AI is that an initially tiny entity might take over the world.
2) My more important point is that your ‘30-year’ number is specific to the starting size of the thing, and is not a general number for getting a DSA. In particular, it does not apply to smaller things.
3) Agreed that income doesn’t equal taking over, though in the modern world, where so much can be purchased, it is closer. It is not clear to me that AI companies do better as a fraction of the world in terms of military power than they do in terms of spending.
“The concern with AI is that an initially tiny entity might take over the world.”
This is a concern with AI, but why is it *the* concern? If, e.g., the United States could take over the world because it had some AI-enabled growth, why would that not be a big deal? I’m imagining you saying, “It’s not unique to AI,” but why does it need to be unique? If AI is the root cause of something on the order of Britain colonizing the world in the 19th century, this still seems like it could be concerning if there weren’t any good governing principles established beforehand.
I like your point #2; I should think more about how the 30-year number changes with size. Obviously it’s smaller for bigger entities and bigger for smaller entities, but by how much? E.g., if we teleported 2020 Estonia back into 1920, would it be able to take over the world? Probably. What about into 1970, though? Less clear.
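A minimal sketch of how that number might scale with starting size, under a deliberately simple toy model: the entity has a constant growth-rate advantage over the rest of the world, and the time to a DSA is taken to be the time needed to close the size gap. The specific growth rates below are placeholders, chosen only so the large-country case lands near the 30-year ballpark:

```python
import math

def years_to_close_gap(start_fraction, entity_growth, world_growth):
    """Years for an entity starting at `start_fraction` of world size to catch
    up with the world, assuming constant exponential growth rates (toy model)."""
    advantage = (1 + entity_growth) / (1 + world_growth)  # yearly gap-closing factor
    return math.log(1 / start_fraction) / math.log(advantage)

# Placeholder rates: the entity grows 8% per year, the rest of the world 3%.
for label, frac in [("large country (~1/5 of world)", 1 / 5),
                    ("small project (~1/100,000 of world)", 1e-5)]:
    years = years_to_close_gap(frac, entity_growth=0.08, world_growth=0.03)
    print(f"{label}: ~{years:.0f} years")   # ~34 years vs. ~240 years
```

The only takeaway from this toy model is that the time scales with the log of the starting-size gap, so the same growth advantage that gives a large country a DSA in a few decades gives a tiny project one only after centuries; it does not capture the Cortez-style dynamics discussed below, where the relevant metric seems to jump by orders of magnitude.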
Military power isn’t what I’m getting at either, at least not if it is measured in a way that would result in AI companies having little of it. Cortez had maybe 1⁄10,000th of the military power of Mexico when he got started, at least if you measure in ways like “what would happen if X fought Y?” Probably 1⁄10,000th of Mexico’s military could have defeated Cortez’s initial band.
If we try to model Cortez’s takeover as him having more of some metric than all of Mexico had, then presumably Spain had several orders of magnitude more of that metric than Cortez did, and Western Europe as a whole had at least an order of magnitude more than that. So Western Europe had *many* orders of magnitude more of this stuff, whatever it is, than Mexico, even though Mexico had a similar population and GDP. Western Europe must therefore have been growing much faster than Mexico for quite some time to build up such a lead, and this was before the industrial revolution! More generally, whatever metric we use to predict takeovers seems to be the sort of thing that can grow and/or shrink by orders of magnitude very quickly, as illustrated by the various cases throughout history of small groups from backwater regions taking over rich empires.
(Warning: I’m pulling these claims out of my ass; I’m not a historian, and I might be totally wrong. I should look up these numbers.)