I think that’s premature. This is just one (digital, synchronous) implementation of one model of a BNN that can be shown to converge on the same result as backprop. In a neuromorphic implementation of this circuit, the convergence would occur on the same time scale as the forward propagation.
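To make that concrete, here’s a minimal sketch of the kind of equivalence I mean, assuming a predictive-coding-style relaxation as the BNN model (the point doesn’t hinge on that particular choice; other schemes such as equilibrium propagation show similar correspondences). Everything below is a toy, hypothetical setup for illustration: a loop of purely local updates settles to approximately the same weight changes that backprop computes in one pass.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-layer net: 3 inputs -> 4 tanh hidden units -> 2 linear outputs.
# Small weights keep the feedback influence weak, where the
# predictive-coding / backprop correspondence is tightest.
W1 = rng.normal(scale=0.1, size=(4, 3))
W2 = rng.normal(scale=0.1, size=(2, 4))
x0 = rng.normal(size=3)

def f(x):
    return np.tanh(x)

def df(x):
    return 1.0 - np.tanh(x) ** 2

# Feed-forward pass, plus a target slightly off the prediction.
h = W1 @ f(x0)
y = W2 @ f(h)
t = y + 0.05 * rng.normal(size=2)

# --- Backprop reference (MSE loss, descent direction) ---
d_out = t - y
d_hid = df(h) * (W2.T @ d_out)
bp_dW2 = np.outer(d_out, f(h))
bp_dW1 = np.outer(d_hid, f(x0))

# --- Predictive-coding relaxation: iterated, purely local updates ---
# This synchronous digital loop stands in for the continuous settling
# a neuromorphic substrate would do during its forward pass.
x1 = h.copy()                 # hidden value nodes start at the prediction
for _ in range(300):
    e1 = x1 - W1 @ f(x0)      # local prediction error at the hidden layer
    e2 = t - W2 @ f(x1)       # output value node clamped to the target
    x1 += 0.1 * (-e1 + df(x1) * (W2.T @ e2))

e1 = x1 - W1 @ f(x0)
e2 = t - W2 @ f(x1)
pc_dW2 = np.outer(e2, f(x1))  # weight updates use only locally available terms
pc_dW1 = np.outer(e1, f(x0))

# The relaxed updates approximately match the backprop gradients (the gap is
# small here and shrinks with the weight scale; exact equivalence needs
# extra assumptions in the literature).
print("dW2 rel. err:", np.linalg.norm(pc_dW2 - bp_dW2) / np.linalg.norm(bp_dW2))
print("dW1 rel. err:", np.linalg.norm(pc_dW1 - bp_dW1) / np.linalg.norm(bp_dW1))
```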
Well, another advantage of the BNN is of course the high parallelism. But that doesn’t change the computational cost (the number of FLOPs required); it just spreads the work out in parallel.