That is a tad too high; the more accurate figure is 10^14 ops/second (10^14 synapses × an average spike rate of ~1 Hz).
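For what it's worth, here is a back-of-envelope sketch of that estimate. The synapse count and average firing rate are the rough figures quoted above, not measured values:

```python
# Average-rate estimate of brain "ops/second" (figures quoted in this thread, assumed).
num_synapses = 1e14       # ~10^14 synapses in a human brain (assumed)
avg_spike_rate_hz = 1.0   # average firing rate of ~1 Hz (assumed)

ops_per_second = num_synapses * avg_spike_rate_hz
print(f"average-rate estimate: {ops_per_second:.0e} ops/s")  # ~1e+14
```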
I’ve seen various people give estimates on the order of 10^16 FLOPS by considering the maximum firing rate of a typical neuron (~10^2 Hz) rather than the average firing rate, as you do.
On one hand, a neuron must do some computation whether it fires or not, and a “naive” simulation would necessarily use a cycle frequency on the order of 10^2 Hz or more. On the other hand, if the result of a computation is almost always “do not fire”, then as a random variable the result has little information entropy, and this could perhaps be exploited to optimize the computation. I don’t have a strong intuition about this.
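To illustrate the gap between the two estimates, here is a rough sketch using the figures quoted in this thread (the synapse count and firing rates are assumptions, not measurements):

```python
# Compare the average-rate and max-rate estimates discussed above.
num_synapses = 1e14   # ~10^14 synapses (assumed)
avg_rate_hz = 1.0     # average firing rate ~1 Hz (assumed)
max_rate_hz = 1e2     # maximum firing rate ~10^2 Hz (assumed)

avg_estimate = num_synapses * avg_rate_hz   # ~1e14 ops/s
max_estimate = num_synapses * max_rate_hz   # ~1e16 ops/s

# If a naive simulation clocks every synapse at the max rate but only
# ~1 in 100 cycles actually ends in a spike, ~99% of cycles compute
# "do not fire" -- the low-entropy result one might hope to exploit.
sparsity = avg_rate_hz / max_rate_hz
print(f"avg-rate estimate: {avg_estimate:.0e} ops/s")
print(f"max-rate estimate: {max_estimate:.0e} ops/s")
print(f"fraction of cycles that spike: {sparsity:.0%}")
```

The factor of ~100 between the two estimates is just the ratio of maximum to average firing rate.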
The minimal computation required to simulate a single GPU in real time is 10,000 times higher.
On a traditional CPU, perhaps; on another GPU, I don’t think so.