Because of the large amount of compute required to create AGI, governments could impose strict regulation on that compute to prevent AGI from being created. The compute needed to create AGI probably falls every year, but such regulation would still buy a lot of time.
Military-backed hackers can easily gain access to, or hijack, compute elsewhere, which means that state-backed AI development would not be meaningfully constrained by regulation of this kind. This is one of the big reasons why EY (Eliezer Yudkowsky) has made high-profile statements about eliminating all compute, even though that idea is considered heretical by virtually all decisionmakers in the AI domain.
It's also the only reason why people talk about "slowing down AI progress" through sweeping, stifling industry regulation rather than banning specific kinds of AI (which is even more heretical): sweeping regulation could conceivably happen in English-speaking countries without an agreement that successfully establishes enduring regulation in Russia and China. Trust problems in the international arena are already astronomically complex by default, because there are large numbers of agents (e.g. spies) who inherently strive for maximum non-transparency and information asymmetry.