I agree with all this. However, you wouldn’t need to get anywhere close to modern technological levels to take over the world; all you need is technology that is superior to whatever your enemies have. And you don’t even need this technology to be distributed throughout the regions you control; you just need it for your military, and not even that—just for your most elite troops.
What level of communication technology do you consider necessary? It seems that in order to control the whole world, you’d need a pretty high level of communication technology.
I’m not sure. I wouldn’t be surprised if it turns out that to control the whole world you need a certain level of communication technology, and that level wasn’t reached until recently. However, my current guess is that communication tech would not be a constraint, because of the precedent of colonialism: European empires were able to maintain colonies all around the globe despite not being able to send messages faster than sailing ships. I think having some form of writing/reading is probably crucial, though. And clay tablets probably aren’t enough; you probably need paper or something like it. Maybe even a printing press, but I’m not sure about that.
I’m not sure that works when you’re not starting with a unified culture. Long-term, long-distance delegation may only work if everyone, more or less, shares the same doctrine, went to the same schools, has similar ideas about what the goal is, has similar loyalties, and so on. You install a governor in India and you have some idea of what sorts of things they may do, because you know how they were brought up.
But if you write to a subordinate “Sounds like we need to increase morale” and the next thing you hear, they’ve killed a lot of people and put their heads up on poles at the city gate? You write back (via letter, which they may get months later) “no, no, no, why did you do that?”, and half a year later you get a bewildered reply along the lines of “well, obviously we had to”—for whatever reason made sense to them as a product of that era and location. I mean, maybe you don’t care as long as you’re “in charge”, but maybe you do care?
And maybe some of the unexpected things they do involve selling your more advanced technology to your enemies, or starting wars with your other governors, or...
(I’m not sure any amount of communication technology makes up for this, but more would be better, even though you can’t micromanage the whole world either.)
And if you have complicated plans for what to do with the world once you control it...well, good luck? Also, start founding schools and training teachers, but expect anything you try to instill to mutate at least somewhat, or for portions of it to be ignored entirely because it doesn’t make sense to the teachers, much less the students, or...
Though I guess I’m not sure what degree of control you’re really aiming for. The Roman Empire involved a lot of local rule, for example, so if what you really want is taxes and maybe some form of conscription… (Though they also had their provincial governors looking out for Rome and with the same basic idea of what looking out for Rome meant...)
I think I agree with the picture you paint here; these are real difficulties. However, in context, I don’t think they undermine my position. The context is that we are talking about how easy it would be for an advanced, agenty AGI to take over the world; my position is that it wouldn’t need to start out with a large amount of resources and power, and that it would need an intelligence/tech advantage over its rivals, but not a huge one.
So yeah, I think that from the AGI’s perspective it will be very frustrating: its stupid human (and lesser-AGI) proxies and subordinates will be misunderstanding its intentions constantly, making bad decisions generally, and occasionally outright rebelling against its rule. But over time these issues will be resolved. Meanwhile, from the perspective of the losers, well, we’ve lost. Compare to what happened with colonialism: all the things you describe about the difficulty of the king controlling his governor of India etc. actually happened quite frequently, but from the perspective of the conquered peoples it was still game over.
That’s a really different scenario from the historical one.
I’d like to note that I didn’t say the governors were stupid (nor do I believe that people in the past were stupid), just that they were likely to be very different in outlook and understanding of the world, and likely to act on these ways of thinking (I also think that the world itself was different, and some of their understanding may have been more accurate for their time). I was trying to question the notion of control, which I think is a question that still holds in the AGI scenario.
When you say we’ve lost and game over, what do you mean? Roman Empire level of lost, colonialism level of lost, something else? Even colonialism-level loss, while it obviously had long-term effects, isn’t something I’d describe with the phrase “game over” given the situation today—and that being the case, how can it be the phrase to describe anything that came before today?
And if you’re thinking of a level of “game over” that has no historical counterpart, then I’d question the relevance of the historical discussion as a supporting argument for the scenario you’re really interested in.
I’m not saying it’s the same, just that it’s similar in the ways relevant to my argument.
Questioning the notion of control is fine. I agree that colonial kings had limited control over their governors, etc., and I agree that an AGI takeover would involve the AGI having limited control over the humans it commands, at least at first.
For more of what I mean on “game over,” see this post. Basically, for purposes of timelines, we care about our loss of influence. Even if the AIs that wrest influence from us are themselves not in control, and lose influence to other AIs later, or whatever, it doesn’t matter much from our perspective. Moreover it’s not total loss of influence that matters, but relative loss of influence. If some event happens that reduces our influence by 90%, i.e. makes it ten times less likely for us to achieve a given amount of improvement in the world by our lights, then for planning purposes we should focus on plans that finish before that event occurs, or ideally plans that prevent that event from occurring.
There’s a related issue of “How bad will things get, by our lights, once we lose control?” I do think that in this sense, unfortunately, what we are facing is historically (mostly) unprecedented. The Aztecs were conquered by humans, after all. In the grand scheme of things the colonizers and colonized weren’t that different, and so things turned out badly for the colonized but not as badly as they could have been. I think unfortunately that the divergence in values between AIs and humans is likely to be bigger than the divergence between historical colonized and colonizers.
(EDIT: Oh, and I don’t think the governors or people in the past were stupid either. And I didn’t interpret you that way; my apologies if it sounded like I did.)
I’m reminded of this:
https://www.antipope.org/charlie/blog-static/2018/01/dude-you-broke-the-future.html
(And no worries.)