No, it’s proportional to the log of the number of patterns that can be (semi-stably) stored. E.g. n bits can store 2^n patterns.
Oops! Correct. That’s what I was thinking, which is why I said info ∝ N log N for N neurons. N neurons ⇒ at most N^2 connections; at 1 bit per connection, that’s at most N^2 bits in the simplest model.
The math trying to estimate the number of patterns that can be stored in different neural networks is horrendous. I’ve seen “proofs” for Hopfield network capacity ranging from, I think, N/logN to NlogN.
Anyway, it’s more-than-proportional to N, if for no other reason than that the number of connections per neuron is related to the number of neurons. A human neuron has about 10,000 connections to other neurons. Ant neurons don’t.
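For concreteness, here is a minimal Hopfield-network sketch (assuming NumPy; the sizes and corruption level are illustrative choices, not from the thread): Hebbian outer-product weights store a handful of bipolar patterns, and asynchronous updates recover one from a corrupted probe. This is the standard model whose capacity the disputed estimates refer to.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 100  # neurons -> up to N^2 pairwise connections
P = 5    # stored patterns, kept well below the capacity estimates

# Random bipolar (+1/-1) patterns to memorize.
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian learning: W = (1/N) * sum of outer products of the patterns,
# with the diagonal zeroed (no self-connections).
W = (patterns.T @ patterns) / N
np.fill_diagonal(W, 0)

def recall(probe, sweeps=10):
    """Asynchronous updates until a fixed point (a stored or spurious attractor)."""
    state = probe.copy()
    for _ in range(sweeps):
        changed = False
        for i in range(N):
            s = 1 if W[i] @ state >= 0 else -1
            if s != state[i]:
                state[i] = s
                changed = True
        if not changed:  # reached a fixed point
            break
    return state

# Flip 10% of the bits of the first stored pattern, then let the network settle.
probe = patterns[0].copy()
flipped = rng.choice(N, size=N // 10, replace=False)
probe[flipped] *= -1
recovered = recall(probe)
overlap = (recovered @ patterns[0]) / N  # 1.0 means perfect recall
```

At this light load (P/N = 0.05) recall from modest corruption almost always succeeds; push P up toward the capacity regime and recall degrades sharply, which is exactly where the contentious capacity math lives.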