It does kinda make sense to plant the world thick with various AIs and counter-AIs, because that makes it harder for one AI to rise and take over everything.
I’m not sure about that. It makes sense if the AIs stay more or less equal in intelligence and power, the way humans do. But it doesn’t make sense if the strongest AI is to the next most powerful what we are to gorillas, or mice. The problem is that each of the AGIs will have the same instrumental goals of power-seeking and self-improvement, so there will be a race much like the one between Google and Microsoft, only far quicker and fiercer. It’s extremely unlikely that they will all grow in power at about the same rate, so one will outpace the others fairly soon. In the end, “the winner takes it all”, as they say.
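To see why even a small difference in self-improvement rates leads to runaway divergence, here’s a minimal toy sketch. The specific growth rates (10% vs. 15% per improvement cycle) are arbitrary assumptions for illustration only, not a claim about real systems:

```python
# Toy model: two AGIs whose capability compounds each improvement cycle.
# Growth rates per cycle are arbitrary illustrative assumptions.
a, b = 1.0, 1.0  # starting capabilities, equal at cycle 0
for cycle in range(1, 101):
    a *= 1.10  # AGI A improves 10% per cycle
    b *= 1.15  # AGI B improves 15% per cycle
    if cycle % 25 == 0:
        print(f"cycle {cycle}: B is {b / a:.0f}x as capable as A")
```

The point is that the *ratio* between the two grows exponentially too (here by a factor of 1.15/1.10 each cycle), so the initially tiny edge compounds into roughly a 3x lead after 25 cycles and an ~85x lead after 100, which is the “winner takes it all” dynamic.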
It may be that we’ll find ways to contain AGIs, limit their power-seeking, etc., for a while. But I can’t see how this will remain stable for long. It seems like trying to stop evolution.