Is there some more general limit to power begetting power that would also affect AGI?
The only one which immediately comes to mind is inflexibility. Often companies shrink or fail entirely because they’re suddenly subject to competition armed with a new idea. Why do the new ideas end up implemented by smaller competitors? The positive feedback of “larger companies have more people who can think of new ideas” is dominated by the negative feedbacks of “even the largest company is tiny compared to its complement” and “companies develop monocultures where everyone thinks the same way” and “companies tend to internally suppress new ideas which would devalue the company’s existing assets”.
Any AGI that doesn’t already command half the world’s brainpower would be subject to the first limitation (which may mean any AGI that hasn’t taken over half the world, or just any AGI less than an hour old, depending on how much “foom” the real world turns out to allow, I admit), and an AGI that propagates by self-copying might be even more affected than humans by the second limitation. Whether an AGI is subject to the third limitation would depend on its aspirations, I suppose; anything with a “conquer the world” endgame would hardly let itself get trapped in the “just stretch out current revenues as long as possible” mindset of Blockbuster-vs-Netflix...