According to https://www.sciencedirect.com/science/article/pii/S0896627321005018?dgcid=coauthor
“Cortical neurons are well approximated by a deep neural network (DNN) with 5–8 layers”

“However, in a full model of an L5 pyramidal neuron consisting of NMDA-based synapses, the complexity of the analogous DNN is significantly increased; we found a good fit to the I/O of this modeled cell when using a TCN (my note: temporal convolutional network) that has five to eight hidden layers”
For best performance, the network width (units per hidden layer) was 256.
Since a single L5 neuron can compute like a small neural network, this might have implications for the computational power of the brain.
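For a concrete picture of the regime they describe, here's a rough PyTorch sketch of a TCN with 7 hidden layers of width 256 (within the paper's 5–8 range). This is not the paper's exact model: the synapse count, kernel size, doubling dilation schedule, and the somatic-voltage output head are all my own placeholders.

```python
# A minimal sketch (assumptions noted above) of a TCN mapping streams of
# synaptic activity to a per-millisecond prediction of somatic voltage.
import torch
import torch.nn as nn

class CausalConv1d(nn.Module):
    """1D convolution padded on the left only, so the output at time t
    depends solely on inputs at times <= t."""
    def __init__(self, in_ch, out_ch, kernel_size, dilation=1):
        super().__init__()
        self.pad = (kernel_size - 1) * dilation
        self.conv = nn.Conv1d(in_ch, out_ch, kernel_size, dilation=dilation)

    def forward(self, x):
        x = nn.functional.pad(x, (self.pad, 0))  # left-pad the time axis
        return self.conv(x)

class TCN(nn.Module):
    # n_synapses=1024 is a placeholder, not the paper's input count.
    def __init__(self, n_synapses=1024, width=256, n_hidden=7, kernel_size=5):
        super().__init__()
        layers, ch = [], n_synapses
        for i in range(n_hidden):
            # Dilation doubles per layer, widening the temporal receptive field.
            layers += [CausalConv1d(ch, width, kernel_size, dilation=2 ** i),
                       nn.ReLU()]
            ch = width
        self.body = nn.Sequential(*layers)
        self.head = nn.Conv1d(width, 1, kernel_size=1)  # per-time-step voltage

    def forward(self, x):                # x: (batch, n_synapses, time)
        return self.head(self.body(x))  # -> (batch, 1, time)

# Example: 2 input patterns, 1000 one-ms bins of sparse binary synaptic input.
model = TCN()
spikes = (torch.rand(2, 1024, 1000) < 0.01).float()
voltage = model(spikes)
print(voltage.shape)  # torch.Size([2, 1, 1000])
```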
Huh, that’s pretty cool, thanks for sharing.