I hate the term “Neural Network”, as do many serious people working in the field.
There are perceptrons, which were inspired by neurons but are quite different. There are other related techniques that optimize in various ways. There are real neurons, which are very complex and rather arbitrary. And then there is the greatly simplified Integrate-and-Fire (IF) abstraction of a neuron, often with Hebbian learning added.
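For concreteness, here is a minimal sketch of the leaky Integrate-and-Fire idea: a membrane potential integrates input current, leaks toward rest, and emits a spike when it crosses a threshold. The time constant, threshold, and drive below are made-up illustrative values, not taken from any particular model:

    import numpy as np

    def lif_run(current, dt=1.0, tau=20.0, v_rest=0.0, v_thresh=1.0, v_reset=0.0):
        # Leaky IF neuron: integrate input, leak toward rest, spike at threshold.
        v, spikes = v_rest, []
        for i_t in current:
            # leaky integration: dv/dt = (-(v - v_rest) + i) / tau
            v += dt * (-(v - v_rest) + i_t) / tau
            if v >= v_thresh:        # threshold crossing: emit a spike, reset
                spikes.append(True)
                v = v_reset
            else:
                spikes.append(False)
        return spikes

    print(sum(lif_run(np.full(200, 1.5))))  # spike count under constant drive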
Perceptrons solve practical problems, but they are not the answer to everything, as some would have you believe. There are newer, powerful kernel methods that extend perceptrons and can automatically condition the data. There are many other algorithms, such as learning hidden Markov models. IF neurons are used to try to understand brain functionality, but they are not useful for solving real problems (far too computationally expensive for what they do).
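The kernel extension is easy to show concretely. Below is a sketch of the classic kernel perceptron, where the dot product is replaced by a kernel so the data is implicitly mapped into a richer feature space; the RBF bandwidth and the toy XOR data are my own choices for illustration:

    import numpy as np

    def rbf(a, b, gamma=1.0):
        return np.exp(-gamma * np.sum((a - b) ** 2))

    def kernel_perceptron(X, y, epochs=10, kernel=rbf):
        alpha = np.zeros(len(X))   # per-example mistake counts
        for _ in range(epochs):
            for i, x in enumerate(X):
                # prediction is a kernel-weighted vote of past mistakes
                f = sum(alpha[j] * y[j] * kernel(X[j], x) for j in range(len(X)))
                if y[i] * f <= 0:  # mistake: bump this example's weight
                    alpha[i] += 1
        return alpha

    # XOR: not linearly separable, so a plain perceptron fails on it,
    # but under an RBF kernel the same mistake-driven update succeeds
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
    y = np.array([-1, 1, 1, -1])
    alpha = kernel_perceptron(X, y)
    preds = [np.sign(sum(alpha[j] * y[j] * rbf(X[j], x) for j in range(4)))
             for x in X]
    print(preds)  # recovers y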
Which of these quite different technologies is being referred to as a “Neural Network”?
The idea of wiring perceptrons back onto themselves with state is old. Perceptrons have been shown to approximate just about any function, so yes, with state they would be Turing complete. Being able to learn meaningful weights for such “recurrent” networks is relatively recent (1990s?).
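The “wiring back” part is nothing exotic: the unit’s previous output is fed in as an extra input, which gives it state. A toy sketch with arbitrary made-up weights:

    import numpy as np

    w_in, w_rec, bias = 0.8, 0.5, -0.2   # arbitrary illustrative weights
    state = 0.0
    for x in [1.0, 0.0, 1.0, 1.0, 0.0]:
        # the current output depends on the input AND the previous output
        state = np.tanh(w_in * x + w_rec * state + bias)
        print(round(float(state), 3))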
I’d think that deep neural networks, trained with e.g. backpropagation through time (BPTT), are what’s meant here.
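For anyone unfamiliar with the term: BPTT unrolls the recurrence over time and runs ordinary backpropagation through the unrolled graph. A minimal numpy sketch for a vanilla RNN; the sizes, random data, and squared-error loss are arbitrary choices of mine, and biases are omitted for brevity:

    import numpy as np

    rng = np.random.default_rng(0)
    n_in, n_hid, n_out, T = 3, 5, 2, 4
    Wxh = rng.normal(0, 0.1, (n_hid, n_in))    # input -> hidden
    Whh = rng.normal(0, 0.1, (n_hid, n_hid))   # hidden -> hidden (the recurrence)
    Why = rng.normal(0, 0.1, (n_out, n_hid))   # hidden -> output
    xs = rng.normal(size=(T, n_in))            # toy inputs
    ys = rng.normal(size=(T, n_out))           # toy targets

    # forward pass: unroll the recurrence, keeping every hidden state
    hs = [np.zeros(n_hid)]                     # hs[t+1] is the state after step t
    preds = []
    for t in range(T):
        hs.append(np.tanh(Wxh @ xs[t] + Whh @ hs[-1]))
        preds.append(Why @ hs[-1])
    loss = sum(0.5 * np.sum((p - y) ** 2) for p, y in zip(preds, ys))

    # backward pass (BPTT): walk the unrolled graph in reverse,
    # carrying the gradient through the recurrent connection
    dWxh, dWhh, dWhy = np.zeros_like(Wxh), np.zeros_like(Whh), np.zeros_like(Why)
    dh_next = np.zeros(n_hid)                  # gradient arriving from later steps
    for t in reversed(range(T)):
        dy = preds[t] - ys[t]
        dWhy += np.outer(dy, hs[t + 1])
        dh = Why.T @ dy + dh_next              # output gradient + future gradient
        dz = dh * (1 - hs[t + 1] ** 2)         # back through tanh
        dWxh += np.outer(dz, xs[t])
        dWhh += np.outer(dz, hs[t])
        dh_next = Whh.T @ dz                   # carry one step further back

    print(round(loss, 4), round(float(np.linalg.norm(dWhh)), 6))

Gradient descent on these accumulated gradients is what learns the recurrent weights; in practice the repeated tanh factors make gradients vanish or explode over many steps, which is part of why learning them stayed hard until the 1990s.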