In lieu of coming up with a creative solution to your problem, I will relate how Hannu Rajaniemi solves this problem in the Quantum Thief books, particularly for the group called the Sobornost. (Spoilers, obviously.) There are billions (trillions?) of copies of certain individuals, and each copy retains self-copying rights. Each copy knows which agent forked it (who its “copyfather” is), and is programmed to feel “religious awe” and devotion to its specific line of descent. So if you found yourself spawned in this world, you would feel strong awe and obedience toward your copyfather, even stronger awe and obedience toward your copygrandfather, and ultimate devotion to the “original” digital version of yourself (the “prime”). This policy keeps everyone in line and assists in conflict resolution, because there is always a hierarchy of authority among the copies. It also allows small groups of copies to go off and pursue a tangential agenda with trust that the agenda will be in line with what the prime individual wanted.
It’s an interesting solution, but the ability to edit the AIs to reliably feel such emotions is rather further in the future than I want to focus on; I want to start out by assuming that the brain-emulations are brute-force black-box emulations of low-level neural processes, and that it’ll take a significant amount of research to get beyond that stage to create more subtle effects.
That said, I /do/ have some notes on the benefits of keeping careful track of which copies “descend” from which, in order to have a well-understood hierarchy to default back onto in case some emergency requires such organization. I’ve even considered having ‘elder’ copies use a set of computational tricks to hold kill-switches for their ‘descendants’. But having spent some time thinking about this approach, I suspect that an AI-clan which relied on such a rigid hierarchy for organizing its management structure would be rapidly out-competed by AI-clans that applied post-paleolithic ideas on the matter. (But the effort spent thinking about the hierarchy isn’t wasted, as it can still serve as the default basis for legal inheritance should one copy die, and as a default hierarchy in lifeboat situations with limited resources, if the AI-clan hasn’t come up with a better system by then.)
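To make that bookkeeping concrete, here is a minimal sketch of what a lineage registry might look like. This is purely my own illustration under the assumptions above (nothing from the books, and all the names like LineageRegistry and default_heir are made up for the example): each copy records which copy forked it, so a default chain of authority and a default heir can always be reconstructed when needed.

```python
# Hypothetical sketch: lineage bookkeeping for an AI-clan's copies.
# Each copy records its parent ("copyfather"), so the clan can fall back
# on a well-defined chain of authority and a default inheritance rule.

from dataclasses import dataclass, field
from typing import Optional

@dataclass
class CopyRecord:
    copy_id: str
    parent_id: Optional[str]              # None for the "prime" original
    fork_time: float                      # when this copy was spawned
    children: list = field(default_factory=list)

class LineageRegistry:
    def __init__(self, prime_id: str):
        self.records = {prime_id: CopyRecord(prime_id, None, 0.0)}

    def register_fork(self, parent_id: str, child_id: str, fork_time: float):
        """Record that parent_id spawned child_id."""
        self.records[child_id] = CopyRecord(child_id, parent_id, fork_time)
        self.records[parent_id].children.append(child_id)

    def chain_of_authority(self, copy_id: str) -> list:
        """Ancestors from the immediate copyfather back to the prime --
        the default hierarchy to fall back on in an emergency."""
        chain = []
        current = self.records[copy_id].parent_id
        while current is not None:
            chain.append(current)
            current = self.records[current].parent_id
        return chain

    def default_heir(self, deceased_id: str) -> Optional[str]:
        """Default legal inheritance: the eldest descendant,
        falling back to the parent if the copy died childless."""
        children = self.records[deceased_id].children
        if children:
            return min(children, key=lambda c: self.records[c].fork_time)
        return self.records[deceased_id].parent_id

# Example: the prime forks copy A, which then forks copy A1.
registry = LineageRegistry("prime")
registry.register_fork("prime", "A", fork_time=1.0)
registry.register_fork("A", "A1", fork_time=2.0)
print(registry.chain_of_authority("A1"))   # ['A', 'prime']
print(registry.default_heir("A"))          # 'A1'
```

The point of the sketch is only that the default hierarchy costs almost nothing to maintain; the clan is free to ignore it in day-to-day management and reach for it only in inheritance disputes or lifeboat situations.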