If you think all AGIs will coordinate with each other, nobody needs an edge. If you think humans will build lots of AI systems, many of them technically unable to coordinate with each other (because of mechanisms like firewalls, myopia, or sparsity), then world takeover requires an edge: one large enough that the coalition of hostile AIs working together wins the war against humans plus their AIs.
This gets interesting if you think there are diminishing returns to intelligence. In that case the (humans + their AIs) faction could hold a large advantage, since humans start out controlling far more resources than the hostile coalition does.