This is an extremely important unsolved question IMO, because a multipolar scenario appears to be where we're heading if we do adequately solve alignment in time.
See "If we solve alignment, do we die anyway?" and the discussion and edited conclusion there. Even after all of that, I notice I'm still confused.
The best I've come up with is: don't be in a multipolar scenario any more than you absolutely have to. Nonproliferation, like with nukes, seems like the only answer. The best response to a multipolar scenario is to not let it become any more multipolar than it is, and ultimately to make it less multipolar.
The problems you mention seem very bad, and it gets worse when you consider that very advanced technology could probably save a few of a genocidal AI controller's favorite people, or maybe the mind states of many people, even while wiping out the rest of humanity and rival AGIs to secure control of the future for whatever ideology.
Another possibility I should add is that rival AGIs may resort to mutually assured destruction. Having a dead man's switch to crack the Earth's crust or make the sun go nova if you're not around to stop it would be an extreme measure that could be applied. Sending a copy of yourself off to a nearby star with a stealthy departure would also seem like good insurance against a genocidal takeover.
Universal surveillance of Earth and the solar system might suffice to prevent hostile exponential military buildup. That might even be done by a neutral AGI that keeps everyone's secrets as long as they're not violating a treaty against developing the capacity to kill everyone else.