If it turns out that any AI must use a minimum of, say, 10,000 watts or more, then there is hope for controlling AI creation and distribution in the long term.
Note that 1 kW (50-100x human brain wattage) is roughly the power consumption of a very beefy desktop PC, and 10 kW is roughly the power consumption of a single datacenter rack. Even a ~megawatt-scale AI (100 racks) could fit pretty easily within many existing datacenters, or within a single entity’s mid-size industrial basement, at only moderate cost.
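For concreteness, here’s the back-of-envelope arithmetic behind those figures as a small Python sketch. The wattage constants are the rough estimates above (brain at ~20 W, desktop at ~1 kW, rack at ~10 kW), not measured values:

```python
# Back-of-envelope power comparisons. All constants are rough
# illustrative estimates from the discussion above, not measurements.
BRAIN_WATTS = 20            # human brain: roughly 10-20 W
DESKTOP_WATTS = 1_000       # a very beefy desktop PC: ~1 kW
RACK_WATTS = 10_000         # a single datacenter rack: ~10 kW
MEGAWATT_AI_WATTS = 1_000_000  # hypothetical ~megawatt-scale AI

# A 1 kW desktop draws ~50x a 20 W brain (or ~100x a 10 W estimate).
print(f"Desktop vs. brain: {DESKTOP_WATTS / BRAIN_WATTS:.0f}x")

# A megawatt-scale AI at 10 kW per rack needs ~100 racks.
print(f"Racks for megawatt-scale AI: {MEGAWATT_AI_WATTS / RACK_WATTS:.0f}")
```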
Yeah, this isn’t enough to stop companies from producing useful AI, but it does mean we can mostly hope to avoid scenarios where single individuals can reliably build AI. That makes controlling AI possible in scenarios where individuals, rather than companies, are the source of existential risk. It’s also relevant to other questions not focused on existential risk.