That’s assuming the brain is using predictive coding to implement backprop, whereas it might instead be doing something that is more computationally efficient given its hardware limitations. (Indeed, the fact that predictive coding is so inefficient should make you update toward the brain not doing it.)
Partly, yes. But partly the computation could be the cheap part compared to the things it’s trading off against (ability to grow, fault tolerance, …). It is also possible that the brain’s architecture allows it to incorporate a wider range of inputs that backprop might not be able to model (or not efficiently so).
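To make the efficiency point concrete, here is a rough numpy sketch of the usual “predictive coding approximates backprop” setup on a toy two-layer network. The layer sizes, learning rates, iteration count, and tanh nonlinearity are arbitrary illustrative choices, and nothing here is a claim about what cortex actually does; it just shows where the extra cost sits: predictive coding has to run many inference iterations per example before its local weight updates line up with what a single backward pass computes.

```python
# Toy sketch: predictive coding vs. backprop on a tiny 2-layer network.
# All sizes and hyperparameters are illustrative, not biologically motivated.
import numpy as np

rng = np.random.default_rng(0)
f, df = np.tanh, lambda a: 1.0 - np.tanh(a) ** 2

W1 = rng.normal(0, 0.3, (8, 4))
W2 = rng.normal(0, 0.3, (2, 8))
x0 = rng.normal(size=(4, 1))          # clamped input
t  = rng.normal(size=(2, 1))          # clamped target

# --- Backprop: one forward pass, one backward pass. ---
a1 = W1 @ f(x0)
a2 = W2 @ f(a1)
delta2 = t - a2                        # output error (negative loss gradient)
delta1 = df(a1) * (W2.T @ delta2)
gW2_bp, gW1_bp = delta2 @ f(a1).T, delta1 @ f(x0).T

# --- Predictive coding: clamp input and output, then iteratively relax the
# --- hidden activity x1 to reduce the summed squared prediction errors.
x1, x2 = a1.copy(), t.copy()
for _ in range(200):                   # many inference steps per example: the cost
    e1 = x1 - W1 @ f(x0)               # prediction error at the hidden layer
    e2 = x2 - W2 @ f(x1)               # prediction error at the clamped output
    x1 -= 0.1 * (e1 - df(x1) * (W2.T @ e2))   # gradient step on the energy

# Local, Hebbian-looking weight update directions at the settled state.
gW2_pc, gW1_pc = e2 @ f(x1).T, e1 @ f(x0).T

print("cosine(W2 updates):", np.sum(gW2_pc * gW2_bp) /
      (np.linalg.norm(gW2_pc) * np.linalg.norm(gW2_bp)))
print("cosine(W1 updates):", np.sum(gW1_pc * gW1_bp) /
      (np.linalg.norm(gW1_pc) * np.linalg.norm(gW1_bp)))
```

If the relaxation converges, the cosine similarities should come out close to 1, but only after the 200 inner-loop steps; backprop reaches the same update directions with a single backward sweep. Whether that per-example overhead matters is exactly the trade-off question above: it may be cheap relative to what the local, purely Hebbian-style updates buy in terms of growth and fault tolerance.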