However, arriving at a system of agents that maintains this property on its own, with no "super agent," might yield solutions for AGI alignment, or might prevent the creation of such a misaligned agent in the first place.
I doubt that because intelligence explosions or their leadups make things local.