Intelligence requires exploring and simulating a large circuit space, i.e., using something like gradient descent on neural networks. You can use a GPU to do that inefficiently, or you can create custom nanotech analog hardware like the brain.
The brain emulates circuits, and current AI systems on GPUs simulate circuits inspired by the brain’s emulation.
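To make the quoted claim concrete, here is a minimal sketch, in plain numpy, of what “exploring a circuit space via gradient descent” can mean: a hand-rolled two-layer network is pushed by repeated weight updates toward a configuration that computes XOR. Everything below (layer sizes, learning rate, step count) is an illustrative assumption rather than anything from the original exchange.

```python
import numpy as np

# Minimal sketch of "searching a circuit space with gradient descent":
# a tiny two-layer network is nudged, step by step, toward a set of weights
# that implements XOR, a function no single linear unit can compute.
# Sizes, learning rate, and step count are arbitrary illustrative choices.

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(scale=1.0, size=(2, 8))   # input -> hidden weights
b1 = np.zeros(8)
W2 = rng.normal(scale=1.0, size=(8, 1))   # hidden -> output weights
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 2.0
for step in range(10_000):
    # Forward pass: simulate the current candidate circuit.
    h = np.tanh(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)

    # Backward pass: gradients of 0.5 * mean squared error, computed by hand.
    grad_out = (p - y) * p * (1 - p) / len(X)
    grad_W2 = h.T @ grad_out
    grad_b2 = grad_out.sum(axis=0)
    grad_h = (grad_out @ W2.T) * (1 - h ** 2)
    grad_W1 = X.T @ grad_h
    grad_b1 = grad_h.sum(axis=0)

    # Move a small step through "circuit space" toward lower error.
    W2 -= lr * grad_W2; b2 -= lr * grad_b2
    W1 -= lr * grad_W1; b1 -= lr * grad_b1

print(np.round(sigmoid(np.tanh(X @ W1 + b1) @ W2 + b2), 2))  # typically near [0, 1, 1, 0]
```

The point of the toy is only that the update rule searches the space of weight settings incrementally rather than enumerating candidate circuits.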
Intelligence requires exploring and simulating a large circuit space, i.e., using something like gradient descent on neural networks.
I don’t think neuroplasticity is equivalent to architecting an artificial neural network and then running gradient descent on it. That design-and-training process is more analogous to billions of years of evolution, which encoded most of the “circuit exploration” process in DNA. In the brain, some of the weights and even connections are adjusted at “runtime”, but the rules for making those adjustments are necessarily encoded in DNA (see the two-loop sketch below).
(Also, I flatly don’t buy that any of this is required for intelligence.)
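As a hedged illustration of the two-loop picture in the reply, the sketch below separates an outer “evolutionary” search over fixed rules (a hidden-layer size and a plasticity rate, standing in for what DNA encodes) from an inner “lifetime” loop that only adjusts weights at runtime under those fixed rules. The toy regression task, the genome encoding, and all numeric choices are assumptions made up for illustration, not a model anyone in the exchange proposed.

```python
import numpy as np

# Two-timescale sketch (illustrative assumptions throughout):
#   outer loop ~ evolution: searches over the *rules* (architecture, plasticity rate)
#   inner loop ~ lifetime plasticity: adjusts weights at runtime, rules held fixed

rng = np.random.default_rng(1)
X = rng.normal(size=(64, 4))
w_true = np.array([1.0, -2.0, 0.5, 0.0])
y = X @ w_true                         # toy regression task the "organism" faces

def lifetime_fitness(genome, steps=200):
    """Inner loop: runtime weight adjustment under rules fixed by the genome."""
    hidden, lr = genome
    W1 = rng.normal(scale=0.5, size=(4, hidden))
    w2 = rng.normal(scale=0.5, size=hidden)
    for _ in range(steps):
        h = np.tanh(X @ W1)
        err = h @ w2 - y
        # Gradient-like runtime updates; the rule itself (lr, architecture) never changes.
        w2 -= lr * h.T @ err / len(X)
        W1 -= lr * X.T @ (np.outer(err, w2) * (1 - h ** 2)) / len(X)
    return -np.mean((np.tanh(X @ W1) @ w2 - y) ** 2)   # higher fitness = lower error

# Outer loop: crude mutation-and-selection search over the rules themselves.
population = [(int(rng.integers(2, 16)), 10 ** rng.uniform(-3, -0.5)) for _ in range(12)]
for generation in range(5):
    survivors = sorted(population, key=lifetime_fitness, reverse=True)[:4]
    children = [
        (max(2, p[0] + int(rng.integers(-2, 3))), p[1] * 10 ** rng.uniform(-0.3, 0.3))
        for p in survivors for _ in range(2)
    ]
    population = survivors + children

best = max(population, key=lifetime_fitness)
print("evolved rules (hidden units, plasticity rate):", best)
```

Nothing here is meant as a model of the brain; it only makes the distinction legible: the outer loop discovers a rule for adjusting connections, and the inner loop never gets to change that rule.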