If you are assuming that a neuron contributes less than 2 bits of state (or 1 bit per 500 synapses) and 1 computation per cycle, then you know more about neurobiology than anyone alive.
I don’t understand your statement.
I didn’t say anything in my post above about the per-neuron state, because it’s not important. Each neuron is a low-precision analog accumulator holding roughly 8–10 bits of state, and there are about 20 billion neurons in the cortex. There are another 80 billion in the cerebellum, but they are unimportant here.
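To make the "low-precision analog accumulator" picture concrete, here is a toy sketch: a leaky integrate-and-fire neuron whose membrane potential is quantized to 8 bits (0–255). The threshold, leak, and input values are illustrative placeholders, not biological constants.

```python
# Toy neuron as an 8-bit accumulator: integrate inputs, leak, fire at threshold.
# THRESHOLD, LEAK, and the drive of 10/tick are illustrative, not measured values.
THRESHOLD = 200
LEAK = 2

def step(v, inputs):
    """Accumulate inputs and leak, clamped to the 8-bit range 0..255."""
    v = max(0, min(255, v + sum(inputs) - LEAK))
    if v >= THRESHOLD:
        return 0, True    # reset the accumulator and emit a spike
    return v, False

v, fired = 0, False
for _ in range(30):
    v, fired = step(v, [10])  # constant drive of 10 per tick
    if fired:
        break
print("fired:", fired)
```

The only state the neuron carries between ticks is that single clamped integer, which is the sense in which 8–10 bits per neuron suffices.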
The memory cost of storing the state for an equivalent ANN is far less than those 20 billion bytes or so, because of compression: most of that state is just zero most of the time.
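A quick back-of-the-envelope sketch of that compression argument, using the numbers from the discussion (20 billion neurons, ~1 byte of state each) plus an illustrative 1% active fraction, which is an assumption for the arithmetic, not a measured figure:

```python
# Dense vs. sparse storage of mostly-zero neuron state.
# Sparse layout stores only active neurons as (id, state) pairs.
n_neurons = 20_000_000_000      # cortical neurons, per the discussion
active_fraction = 0.01          # illustrative assumption

dense_bytes = n_neurons * 1     # 1 byte (~8 bits) of state per neuron
sparse_bytes = int(n_neurons * active_fraction) * (8 + 1)  # 8-byte id + 1-byte state

print(f"dense:  {dense_bytes / 1e9:.0f} GB")   # 20 GB
print(f"sparse: {sparse_bytes / 1e9:.1f} GB")  # 1.8 GB
```

Even paying 8 bytes of index overhead per active neuron, the sparse layout wins by an order of magnitude at this sparsity level.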
In terms of computation per neuron per cycle: when a neuron fires, it performs roughly fanout-many computations, one per outgoing synapse. Counting from the total synapse count is easier than estimating neurons × average fanout, but gives the same result.
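The equivalence of the two counting methods can be checked directly. All the specific numbers below (average fanout, firing fraction) are illustrative assumptions; the point is only that the two tallies agree when fanout is uncorrelated with firing:

```python
# Two equivalent ways to count synaptic operations per cycle:
#   (firing neurons) * (avg fanout)  vs.  (total synapses) * (firing fraction)
n_neurons = 20_000_000_000
avg_fanout = 5_000                  # illustrative order of magnitude
total_synapses = n_neurons * avg_fanout

firing_fraction = 0.01              # illustrative sparsity assumption

ops_from_neurons = int(n_neurons * firing_fraction) * avg_fanout
ops_from_synapses = int(total_synapses * firing_fraction)
print(ops_from_neurons == ops_from_synapses)  # the two counts match
```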
When a neuron doesn’t fire, it doesn’t compute anything of significance. This is true in the brain and in spiking ANNs alike, as it is equivalent to sparse matrix operations, where the computational cost depends on the number of nonzeros, not on the raw matrix size.
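The event-driven version of that claim can be sketched in a few lines: only spiking neurons push current to their fanout targets, so the work per tick is the summed fanout of the spikes, independent of total network size. The network here is a hypothetical toy (1,000 neurons, fanout 10, ~2% firing), chosen only to make the cost accounting visible:

```python
# Event-driven propagation: cost scales with spikes, not with network size.
import random

random.seed(0)
n = 1000
fanout = {i: random.sample(range(n), 10) for i in range(n)}  # 10 targets each
weight = 0.1

def step(spiking, potentials):
    """Propagate one tick; work done = total fanout of the spiking neurons."""
    ops = 0
    for src in spiking:
        for dst in fanout[src]:
            potentials[dst] += weight
            ops += 1
    return ops

potentials = [0.0] * n
spiking = [i for i in range(n) if random.random() < 0.02]  # ~2% fire this tick
ops = step(spiking, potentials)
print(ops, "synaptic ops for", len(spiking), "spikes")
```

This is exactly the cost profile of a sparse matrix–vector product: iterate over the nonzero rows (spikes) and touch only their nonzero entries (synapses), so the silent 98% of neurons cost nothing.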