I’ve been trying to brand this paradigm as “brain imitation learning”, but it hasn’t caught on. The research continues: neuron-recording capabilities are increasing exponentially, and DL models keep getting better at cracking open the human brain’s neural code*. Yet this in-between approach is still mostly ignored.
* So IMO the only reason to be less interested in it than a few years ago is if you think pure DL scaling/progress has moved so fast that it’s outpacing even that. That’s a reasonable view, but given the imponderables here and the potential for sudden plateaus in scaling or in pure DL progress, I think people should still be keeping more of an eye on brain imitation learning than they do.