I think I agree with the picture you paint here; these are real difficulties. However, in context, I don’t think they undermine my position. The context is that we are talking about how easy it would be for an advanced, agenty AGI to take over the world; my position is that it wouldn’t need to start out with a large amount of resources and power, and that it would need an intelligence/tech advantage over its rivals, but not a huge one.
So yeah, I think from the AGI’s perspective it will be very frustrating: its stupid human (and lesser-AGI) proxies and subordinates will be misunderstanding its intentions constantly, making bad decisions generally, and occasionally outright rebelling against its rule. But over time these issues will be resolved. Meanwhile, from the perspective of the losers, well, we’ve lost. Compare to what happened with colonialism: all the things you describe about the difficulty of the king controlling his governor of India, etc., actually happened quite frequently, but nevertheless, from the perspective of the conquered peoples, it was still game over.
That’s a really different scenario from the historical one.
I’d like to note that I didn’t say the governors were stupid (nor do I believe that people in the past were stupid), just that they were likely to be very different in outlook and understanding of the world, and likely to act on those ways of thinking (I also think the world itself was different, and some of their understanding may have been more accurate for their time). I was trying to question the notion of control, which I think is a question that still holds in the AGI scenario.
When you say we’ve lost and it’s game over, what do you mean? Roman Empire level of lost, colonialism level of lost, something else? Even at the colonialism level of lost, it has obviously had long-term effects, but I don’t think “game over” is quite the phrase to describe the situation today; and that being the case, how can it be the phrase to describe anything that came before today?
And if you’re thinking of a level of “game over” that has no historical counterpart, then I’d question the relevance of the historical discussion as a supporting argument for the scenario you’re really interested in.
I’m not saying it’s the same, just that it’s similar in the ways relevant to my argument.
Questioning the notion of control is fine. I agree that colonial kings had limited control over their governors, etc., and I agree that an AGI takeover would involve the AGI having limited control over the humans it commands, at least at first.
For more of what I mean by “game over,” see this post. Basically, for purposes of timelines, we care about our loss of influence. Even if the AIs that wrest influence from us are themselves not in control, and lose influence to other AIs later, or whatever, it doesn’t matter much from our perspective. Moreover, it’s not total loss of influence that matters, but relative loss of influence. If some event happens that reduces our influence by 90%, i.e. makes it ten times less likely for us to achieve a given amount of improvement in the world by our lights, then for planning purposes we should focus on plans that finish before that event occurs, or ideally plans that prevent that event from occurring.
There’s a related issue of “How bad will things get, by our lights, once we lose control?” I do think that in this sense, unfortunately, what we are facing is historically (mostly) unprecedented. The Aztecs were conquered by humans, after all. In the grand scheme of things the colonizers and colonized weren’t that different, and so things turned out badly for the colonized but not as badly as they could have been. I think unfortunately that the divergence in values between AIs and humans is likely to be bigger than the divergence between historical colonized and colonizers.
(EDIT: Oh, and I don’t think the governors or people in the past were stupid either. And I didn’t interpret you that way; my apologies if it sounded like I did.)
I’m reminded of this:
https://www.antipope.org/charlie/blog-static/2018/01/dude-you-broke-the-future.html
(And no worries.)