The complexity of the human genome puts a rough upper bound on how many parameters would be required to specify an AGI (though once deployed, it would accumulate many more learned parameters). Of course, a superintelligence capable of taking over the world is harder to bound.
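To make the bound concrete, here is a back-of-envelope sketch of the genome's raw information content. The figures (roughly 3.1 billion base pairs, 2 bits per base for the four nucleotides) are standard approximations I'm supplying for illustration, not numbers from the comment itself:

```python
# Back-of-envelope: raw information content of the human genome, as a
# rough upper bound on the bits needed to *specify* (not train) an AGI.
# Assumed figures: ~3.1e9 base pairs, 2 bits per base (uncompressed).

base_pairs = 3.1e9
bits_per_base = 2  # 4 possible nucleotides -> log2(4) = 2 bits

total_bits = base_pairs * bits_per_base
total_megabytes = total_bits / 8 / 1e6

print(f"~{total_bits:.1e} bits, roughly {total_megabytes:.0f} MB uncompressed")
```

This comes out to under a gigabyte uncompressed, and the compressible, non-functional portions of the genome would shrink the effective bound further.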
One further thing to note is that an alien AI might require a lot of memory and processing power to perform its intended task. As I wrote in the post, this is one reason to suppose that aliens might want to target civilizations only after they have reached a certain level of technological development: otherwise their scheme might fail.