Way off. Let’s see… I would bet at even odds that it is 4 or more orders of magnitude off optimal.
You didn’t answer my question: what is your guess at the minimum bit representation of a human-equivalent mind?
You didn’t use the typical methodology for measuring the brain’s storage capacity, nor did you provide another.
I wasn’t talking about molecular-level optimization. I started with the typical assumptions: each synapse represents a few bits, the human brain holds around 100TB to 1PB of data/circuitry, and so on; see The Singularity Is Near.
So if you say the human brain’s algorithmic representation is off by 4 or more orders of magnitude, you are saying that you think a human-equivalent mind can be represented in 10 to 100GB of data/circuitry?
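A quick back-of-the-envelope sketch of where those numbers come from (the synapse count and bits-per-synapse figures are the usual rough assumptions, not measurements):

```python
# Reproduce the storage estimate and the implied 4-OOM compression.
# Assumptions: 10^14 to 10^15 synapses, ~8 bits (1 byte) per synapse.
for synapses in (1e14, 1e15):
    brain_bytes = synapses * 8 / 8          # bits -> bytes
    minimal_bytes = brain_bytes / 1e4       # 4 orders of magnitude smaller
    print(f"{synapses:.0e} synapses -> brain ~{brain_bytes / 1e12:.0f} TB, "
          f"implied minimal mind ~{minimal_bytes / 1e9:.0f} GB")
# 1e14 synapses -> ~100 TB brain,        ~10 GB minimal mind
# 1e15 synapses -> ~1000 TB (1 PB) brain, ~100 GB minimal mind
```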
If so, why has evolution not found that by now? It has had plenty of time to compress at the circuit level. In fact, we actually know that the brain performs provably optimal compression on its input data in a couple of domains; see V1 and its evolution toward Gabor-like edge feature detection.
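For concreteness, the V1 result usually cited here is sparse coding: optimizing a sparse reconstruction objective on natural image patches yields Gabor-like edge filters resembling V1 simple cells (Olshausen & Field). A minimal sketch of that objective, with random data standing in for real patches (all sizes and constants here are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Sparse coding sketch: learn a dictionary D so that patches x ~ D @ a
# with sparse codes a. Random data stands in for whitened natural-image
# patches; Gabor-like filters only emerge when run on real patch data.
patch_dim, n_atoms, n_patches = 64, 100, 2000   # 8x8 patches (illustrative)
X = rng.standard_normal((patch_dim, n_patches))

D = rng.standard_normal((patch_dim, n_atoms))
D /= np.linalg.norm(D, axis=0)                  # unit-norm dictionary atoms

def sparse_codes(D, X, lam=0.1, n_iter=50):
    """ISTA: minimize 0.5*||X - D A||^2 + lam*||A||_1 over the codes A."""
    L = np.linalg.norm(D, 2) ** 2               # Lipschitz constant of the gradient
    A = np.zeros((D.shape[1], X.shape[1]))
    for _ in range(n_iter):
        G = A - D.T @ (D @ A - X) / L           # gradient step on the codes
        A = np.sign(G) * np.maximum(np.abs(G) - lam / L, 0.0)  # soft-threshold
    return A

for step in range(10):                          # alternate codes and dictionary
    A = sparse_codes(D, X)
    D += 0.01 * (X - D @ A) @ A.T               # gradient step on the dictionary
    D /= np.linalg.norm(D, axis=0) + 1e-12      # renormalize atoms
# Trained on natural patches, the columns of D come to look like Gabor filters.
```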
Evolution has had plenty of time to find well-optimized cellular machinery based on DNA, plenty of time to find well-optimized electro-chemical computing machinery built on top of that, and plenty of time to find well-optimized circuits within that space.
Even insects are extremely well-optimized at the circuit level: given their neuron/synapse counts, we have no evidence whatsoever that vastly simpler circuits could perform the same functionality.
When we have used evolutionary exploration algorithms to design circuits directly, given enough time we see similarly complex, messy, but near-optimal designs, and this appears to be a general trend.
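That trend is easy to demonstrate on a toy scale: a bare-bones evolutionary search over NAND-gate netlists reliably finds small working circuits. A sketch under stated assumptions (the gate count, population size, and XOR target are all illustrative, not a reconstruction of any particular published experiment):

```python
import random

random.seed(0)

# Toy evolutionary circuit design: evolve a small NAND-gate netlist
# that computes XOR. Each gene (i, j) wires one NAND gate to two
# earlier signals; the last gate's output is the circuit's output.
N_GATES, POP, GENS = 6, 200, 300
CASES = [(a, b) for a in (0, 1) for b in (0, 1)]
TARGET = [a ^ b for a, b in CASES]              # XOR truth table

def evaluate(genome, a, b):
    signals = [a, b]                            # signals 0 and 1 are the inputs
    for i, j in genome:
        signals.append(1 - (signals[i] & signals[j]))  # NAND
    return signals[-1]

def fitness(genome):
    return sum(evaluate(genome, a, b) == t for (a, b), t in zip(CASES, TARGET))

def random_genome():
    # Gate k may only read from the 2 inputs and the k earlier gates.
    return [(random.randrange(k + 2), random.randrange(k + 2)) for k in range(N_GATES)]

def mutate(genome):
    g = list(genome)
    k = random.randrange(N_GATES)
    g[k] = (random.randrange(k + 2), random.randrange(k + 2))  # rewire one gate
    return g

pop = [random_genome() for _ in range(POP)]
for gen in range(GENS):
    pop.sort(key=fitness, reverse=True)         # truncation selection
    if fitness(pop[0]) == len(CASES):
        print(f"XOR circuit found at generation {gen}: {pop[0]}")
        break
    elite = pop[:POP // 2]
    pop = elite + [mutate(random.choice(elite)) for _ in range(POP - len(elite))]
```

The winning netlists tend to be tangled and redundant rather than textbook-clean, which is the same messy-but-functional character the argument above attributes to evolved biological circuits.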