The government realizes that AI will be decisive for national power and locks down the AGI companies in late 2026. This takes the form of extreme government oversight bordering on nationalization. Progress stays at a similar pace because of the race with other nuclear-weapon states.
if multiple nuclear states started taking ASI very seriously (while still being ignorant of alignment, and myopically focused on state power instead of utopia/moral good), and started racing, any state behind in that race could threaten to nuke any state which continues trying to bring about ASI. in other words, the current Mutually Assured Destruction regime can be unilaterally extended to trigger in response to things other than a state firing nukes.
this is (1) a possible out, since it would halt large-scale ASI development, and (2) something that could happen unilaterally, driven by states myopically seeking dominance.
however, if the nuclear states in question just think ASI will be a very good labor automator, then maybe none would be willing to risk nuclear war over it, even if doing so would technically serve the myopic ‘state power’ goal[1]. i don’t know. (so maybe (2) requires a level of seriousness somewhere above ‘it will automate lots of labor’ but below ‘it is a probable extinction risk’.)
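a rough way to state the credibility threshold gestured at above (my own sketch, not from any source; $U$ denotes a state’s perceived utility): the threat to nuke is only credible if

$$U(\text{carrying out nuclear war}) > U(\text{letting a rival reach ASI first})$$

‘very good labor automator’ plausibly leaves this inequality false, while ‘probable extinction risk’ or ‘permanent rival dominance’ plausibly makes it true.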
[1] “(??? why?)”: by which i mean it seems absurd/perplexing (however likely) that people would be so myopic. ‘the state i was born in having dominance’ is such an alien goal, too.