I enjoyed this primer; and as I value it I was somewhat dismayed to read some of your conclusions in the “what we’ve learned” section about neural coding:
Utilities (in humans) are real numbers ranging from 0 to 1,000 that take action potentials per second as their natural units.
Utilities are encoded cardinally in firing rates relative to neuronal baseline firing rates. (This is opposed to post-Pareto, ordinal notions of utility.)
I’m curious where you got the highly specific “ranging from 0 to 1,000” bit, but more importantly, it’s misleading to mention rate coding as the be-all and end-all of neural coding, especially as it is largely out of date. Neural coding has been found to be more complex.
Where the brain needs to encode significant quantitative information quickly, it primarily employs population coding in small local neuron clusters.
With population coding, numbers from 0 to N can be stochastically encoded and transmitted by a micro-column of N neurons in a minimal time window of 10 ms or less. In theory, population coding can do even better than that once you account for the considerable variation in connection weights across neurons in the population (compare binary coding, where N units can encode 2^N values).
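To make that arithmetic concrete, here is a toy sketch (my own illustration, not from the primer) of a stochastic population count code, assuming each of the N neurons fires at most one spike in the ~10 ms window; the binary encoder is included only for the 2^N comparison:

```python
import numpy as np

rng = np.random.default_rng(0)

def encode_population(value, n_neurons):
    """Stochastic population (count) code: each of the N neurons in the
    micro-column fires at most one spike in the ~10 ms window, with
    probability value / N, so the expected spike count equals the value."""
    return rng.random(n_neurons) < (value / n_neurons)

def decode_population(spikes):
    """Decode by summing spikes across the population: an unbiased but
    noisy estimate of the encoded value."""
    return int(spikes.sum())

def encode_binary(value, n_bits):
    """For contrast: a binary code, where n_bits units address 2**n_bits
    distinct values, but each unit's meaning depends on its position."""
    return np.array([(value >> i) & 1 for i in range(n_bits)], dtype=int)

N = 100
true_value = 63
estimates = [decode_population(encode_population(true_value, N)) for _ in range(1000)]
print(f"population code: mean estimate {np.mean(estimates):.1f} "
      f"(sd {np.std(estimates):.1f}) for a true value of {true_value}")
print("binary code of the same value:", encode_binary(true_value, 7))
```

The point is just that the population estimate is available after a single short window, at the cost of being noisy and needing N neurons for N+1 levels.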
Temporally extending a transmission (a rate code) over a longer window increases the value range linearly in proportion, but at the expense of reducing the computation rate/bandwidth and, much worse, increasing the response latency. Rate coding has not been completely discredited, and it certainly plays a role, but computational efficiency considerations rule it out as the main coding method.
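As a rough, hedged illustration of that trade-off (the 200 Hz ceiling firing rate is an assumed number, used only for the arithmetic):

```python
def rate_code_tradeoff(max_rate_hz, window_s):
    """Back-of-the-envelope numbers for a single-neuron spike-count (rate)
    code: distinguishable levels grow only linearly with the counting window,
    while latency grows with it and throughput falls."""
    levels = max_rate_hz * window_s      # distinct spike counts available
    latency_ms = window_s * 1e3          # you must wait out the whole window
    values_per_s = 1.0 / window_s        # one value transmitted per window
    return levels, latency_ms, values_per_s

# Assumed 200 Hz ceiling firing rate, purely for illustration.
for window in (0.01, 0.1, 1.0):
    levels, latency, rate = rate_code_tradeoff(200, window)
    print(f"window {window * 1e3:6.0f} ms -> {levels:5.0f} levels, "
          f"latency {latency:6.0f} ms, {rate:5.1f} values/s")
```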
I’m curious where you got the highly specific “ranging from 0 to 1,000” bit, but more importantly, it’s misleading to mention rate coding as the be-all and end-all of neural coding, especially as it is largely out of date. Neural coding has been found to be more complex.
I don’t think “rate coding” can be called “out of date” when we don’t even have a ballpark estimate of how computation works in the brain. Research hot topics (namely, temporal coding) are not always indicative of progress.
Maybe you don’t understand what rate coding is. Population codes are still rates. The fundamental debate is over whether a population response r(t) contains all the relevant information needed to understand the computations being performed, or whether the statistical characteristics of the spike patterns themselves carry “extra” information. Here, r(t) is obtained by spike filtering, which removes the finer inter-spike-interval information.
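For what it’s worth, here is a minimal sketch of what that spike filtering amounts to (the kernel width and the toy spike trains are arbitrary choices of mine): smoothing a spike train into r(t), at which point two trains with the same spike count but different inter-spike-interval structure look essentially the same to a rate description.

```python
import numpy as np

def smoothed_rate(spike_times_s, duration_s, dt=0.001, sigma_s=0.02):
    """Estimate r(t) by filtering a spike train with a Gaussian kernel.
    The smoothing averages over fine inter-spike-interval structure, which
    is exactly the information a pure rate description discards."""
    t = np.arange(0.0, duration_s, dt)
    binned = np.zeros_like(t)
    binned[(np.asarray(spike_times_s) / dt).astype(int)] = 1.0 / dt
    kernel_t = np.arange(-4 * sigma_s, 4 * sigma_s, dt)
    kernel = np.exp(-kernel_t ** 2 / (2 * sigma_s ** 2))
    kernel /= kernel.sum()
    return t, np.convolve(binned, kernel, mode="same")

# Two toy spike trains with the same spike count but different ISI patterns
# end up with essentially the same mean r(t).
regular = np.arange(0.05, 1.0, 0.1)                    # 10 evenly spaced spikes
bursty = np.concatenate([np.arange(0.05, 0.5, 0.05),   # 9 clustered spikes ...
                         [0.9]])                        # ... plus one late spike
for name, train in (("regular", regular), ("bursty", bursty)):
    t, r = smoothed_rate(train, 1.0)
    print(f"{name}: {len(train)} spikes, mean r(t) = {r.mean():.1f} Hz")
```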
Rate coding has not been completely discredited, and it certainly plays a role, but computational efficiency considerations rule it out as the main coding method.
No one in theoretical neuroscience has ever said anything like this.
it’s misleading to mention rate coding as the be-all and end-all of neural coding, especially as it is largely out of date. Neural coding has been found to be more complex.
I did not “mention rate coding as the be-all and end-all of neural coding.” There is more to neural coding than rate coding, but rate coding is what predicts choice in our best mathematical models of choice in the final common path of the brain’s choice circuits.