Why limit ourselves to our planet? 12 OOMs is well within reason if we were a Type 2 civilization and had access to all the energy from our Sun (our planet intercepts only about one two-billionth of its output).
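As a quick sanity check on that fraction (my own back-of-the-envelope numbers, using standard values for Earth's radius and orbital distance, not figures from the post):

```python
# Rough estimate of the fraction of the Sun's output that Earth intercepts,
# and how many orders of magnitude (OOMs) of extra energy full solar capture buys.
import math

R_earth = 6.371e6   # Earth radius, m (standard value)
d_sun = 1.496e11    # Earth-Sun distance, m (1 AU)

# Earth's cross-sectional disk vs. the full sphere at 1 AU.
fraction = (math.pi * R_earth**2) / (4 * math.pi * d_sun**2)
print(f"fraction of solar output hitting Earth: {fraction:.1e}")        # ~4.5e-10
print(f"extra OOMs of energy from capturing it all: {-math.log10(fraction):.1f}")
```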
Encryption wouldn’t really be an issue: we can simply tune our algorithms to use slightly harder assumptions. After all, one can just pick a problem whose cost scales as O(10^(6n)), where n could for example be the secret key length. If the attacker has 12 orders of magnitude more compute, increasing n by just 2 restores the same security margin and you still have your cryptography.
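A quick illustration of why that works (my numbers, assuming the usual brute-force model where attack cost grows exponentially in key length):

```python
# How much longer a key must get to absorb 12 OOMs of extra attacker compute,
# under two exponential cost models. Both models are illustrative assumptions.
import math

extra_compute_ooms = 12

# Model 1: cost ~ 2^k for a k-bit key, so each added bit doubles the work.
extra_bits = extra_compute_ooms * math.log2(10)      # need 2^extra_bits >= 10^12
print(f"extra key bits needed: {math.ceil(extra_bits)}")    # ~40 bits

# Model 2: the toy 10^(6n) scaling from the text; each unit added to n
# multiplies the attacker's work by a factor of 10^6.
extra_n = extra_compute_ooms / 6
print(f"extra units of n needed: {math.ceil(extra_n)}")     # 2
```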
The thought of how small computers (phones, etc.) would scale also came to me. Basically, with 12 OOMs every phone becomes a computer powerful enough to train something as complicated as GPT-3. Everyone could carry their own personalized GPT-3 model with them. Actually, this suggests another way to improve AI performance: reduce the amount of things the model needs to capture. Training a personalized model for a single problem would be cheaper and would need fewer parameters/layers to get useful results.
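A rough sanity check on the phone claim (my own ballpark assumptions: a present-day phone delivers on the order of 10^12 FLOP/s, and the original GPT-3 training run took roughly 3.14e23 FLOPs; utilization, memory, and bandwidth are ignored):

```python
# Back-of-the-envelope: could a phone scaled up by 12 OOMs train GPT-3?
phone_flops = 1e12                     # assumed throughput of a current phone SoC
scaled_phone_flops = phone_flops * 1e12  # after 12 OOMs: ~1e24 FLOP/s
gpt3_training_flops = 3.14e23            # widely cited total for GPT-3's training run

seconds = gpt3_training_flops / scaled_phone_flops
print(f"idealized time to train GPT-3 on a 12-OOM phone: ~{seconds:.2f} s")
```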
In short, we would be able to put a “small” specialized model with GPT-3-level power on every microcontroller.
You mentioned deep fakes. But with this much compute, why not “deepfake” brand-new networks from scratch? Doing such experiments today is expensive, since this “second-order” training mode roughly quadruples the computational resources required.
Theoretically, nothing prevents us from constructing a network that assembles new networks based on training parameters. This meta-network could draw on the structure it “learned” while training smaller models for other applications in order to generate new models with an order of magnitude fewer parameters.
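For concreteness, here is a minimal sketch of that idea in the spirit of a hypernetwork (my own toy construction in PyTorch, not anything proposed in the post): a small generator network emits the weights of a task-specific model, conditioned on a task embedding.

```python
# Toy hypernetwork: a generator that outputs the weights of a 2-layer MLP
# from a task embedding, so one forward pass "assembles" a task-specific model.
import torch
import torch.nn as nn
import torch.nn.functional as F

class HyperNet(nn.Module):
    def __init__(self, task_dim=16, in_dim=8, hidden=32, out_dim=2):
        super().__init__()
        self.in_dim, self.hidden, self.out_dim = in_dim, hidden, out_dim
        n_params = in_dim * hidden + hidden + hidden * out_dim + out_dim
        # The "meta-network": maps a task embedding to a flat parameter vector.
        self.generator = nn.Sequential(
            nn.Linear(task_dim, 128), nn.ReLU(), nn.Linear(128, n_params)
        )

    def forward(self, task_embedding, x):
        # Slice the generated flat vector into the target model's weight tensors.
        p = self.generator(task_embedding)
        i = 0
        w1 = p[i:i + self.in_dim * self.hidden].view(self.hidden, self.in_dim)
        i += self.in_dim * self.hidden
        b1 = p[i:i + self.hidden]; i += self.hidden
        w2 = p[i:i + self.hidden * self.out_dim].view(self.out_dim, self.hidden)
        i += self.hidden * self.out_dim
        b2 = p[i:i + self.out_dim]
        # Run the generated model on the input batch.
        h = F.relu(F.linear(x, w1, b1))
        return F.linear(h, w2, b2)

# Usage: one task embedding yields one small, specialized classifier.
hyper = HyperNet()
task = torch.randn(16)   # embedding of the task / "training parameters"
x = torch.randn(4, 8)    # a batch of inputs for the generated model
logits = hyper(task, x)  # shape (4, 2)
```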
As an analogy, compare the amount of data a neural network needs to learn to differentiate between cats and dogs with the amount a human needs to learn the same thing. A human only needs a couple of examples, while a neural network needs hundreds or thousands of examples just to learn the concept of shapes and topology.