This one is optimized more for acceptance than for alignment, but it stands a real chance of getting funding, acceptance, and implementation from decisionmakers.
Instead of having the AGI defeat a foreign adversary totally, have it put the host nation in a more advantageous position while remaining somewhat acceptable to the adversary nation. This involves modelling a lot of serious people, all of them simultaneously and in their ordinary interconnected environment, while emphasizing the preservation of the way all their values fit together. If the host nation gets what it wants, that is a lot of people to model, at least a dozen.
More importantly, it will be taken seriously by the kind of people who have the final say over what gets implemented. “Win the war” is something an AGI can do, and perhaps even do according to the preferences of the people in charge.