I’m a computer expert but a brain newbie.
The typical CPU is built from n-NOR, n-NAND, and NOT gates. The NOT gate works like a 1-NAND or a 1-NOR (they’re the same thing, electronically). Everything else, including AND and OR, is made from those three. The actual logic only requires NOT and {1 of AND, OR, NAND, NOR}. Notice there are several minimal gate sets and a larger set of gates actually used.
The brain (I’m theorizing now, I have no background in neural chemistry) has a similar set of basic gates that can be organized into a Turing machine, and the gate I described previously is one of them.
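The gate-composition point above can be sketched in a few lines. This is an illustrative example, not from the thread: it builds NOT, AND, and OR out of NAND alone (NAND by itself is one of the universal gate sets), and checks them against Python's built-in operators.

```python
# Illustrative sketch: building other gates from NAND alone.

def nand(a, b):
    return not (a and b)

def not_(a):
    # NOT is a 1-input NAND (both inputs tied together).
    return nand(a, a)

def and_(a, b):
    # AND = NOT(NAND).
    return not_(nand(a, b))

def or_(a, b):
    # OR via De Morgan: a OR b = NAND(NOT a, NOT b).
    return nand(not_(a), not_(b))

# Exhaustive check over all input combinations.
for a in (False, True):
    for b in (False, True):
        assert not_(a) == (not a)
        assert and_(a, b) == (a and b)
        assert or_(a, b) == (a or b)
```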
We don’t run on logic gates. We run on noisy differential equations.
No.
You can represent logic gates using neural circuits, and use them to describe arbitrary finite-state automata that generalize into Turing-complete automata in the limit of infinite size (or by adding an infinite external memory), but that’s not how the brain is organized, and it would be difficult to have any learning in a system constructed in this way.
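To make the first clause concrete: a single McCulloch–Pitts-style threshold unit can act as a logic gate. This is a toy sketch with hand-picked weights and threshold (illustrative choices, not brain data), showing one "neuron" computing NAND.

```python
# Toy sketch: a threshold "neuron" acting as a NAND gate.
# Weights and threshold are hand-picked for illustration.

def neuron(inputs, weights, threshold):
    # Fires (returns 1) iff the weighted input sum meets the threshold.
    return 1 if sum(w * x for w, x in zip(weights, inputs)) >= threshold else 0

def nand(a, b):
    # Negative weights with threshold -1: fires unless both inputs are on.
    return neuron((a, b), (-1, -1), -1)

# Check the full truth table.
for a in (0, 1):
    for b in (0, 1):
        assert nand(a, b) == (0 if (a and b) else 1)
```

Since NAND is universal, wiring enough such units together gives any finite Boolean circuit, which is exactly the (impractical, unlearnable) construction being described.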
You might want to look into what’s called ANN—artificial neural networks.
ANNs don’t begin to scratch the surface of the scale or complexity of the human brain.
Not that they’re not fun as toy models, or useful in their own right, just remember that they are oblivious to all human brain chemistry, and to chemistry in general.
Of course, but Cube is talking about “a similar set of basic gates that can be organized into a Turing machine” which looks like an ANN more than it looks like wetware.