Let's make the question more specific: what's the minimum bit representation of a human-equivalent mind?
Way off. Let’s see… I would bet at even odds that it is 4 or more orders of magnitude off optimal.
If you think the brain is far off that, how do you justify that?
We have approximately one hundred billion neurons each and roughly the same number of glial cells (more of the latter if we are smart!). Each of those includes a full copy of our DNA, which is itself not exactly optimally compressed.
You didn't answer my question: what is your guess at the minimum bit representation of a human-equivalent mind?
You didn't use the typical methodology of measuring the brain's storage, nor did you provide another.
I wasn't talking about molecular-level optimization. I started with the typical assumption that each synapse represents a few bits, that the human brain holds around 100 TB to 1 PB of data/circuitry, and so on—see The Singularity Is Near.
So you say the human brain's algorithmic representation is off by 4 orders of magnitude or more—are you saying that you think a human-equivalent mind can be represented in 10 to 100 GB of data/circuitry?
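The arithmetic behind those figures can be checked in a few lines. All of the inputs below are the thread's own rough assumptions (neuron count, synapses per neuron, bits per synapse), not measurements:

```python
# Back-of-envelope check of the storage estimate above.
neurons = 1e11             # ~100 billion neurons
synapses_per_neuron = 1e4  # a common rough estimate
bits_per_synapse = 4       # "a few bits" per synapse

total_bits = neurons * synapses_per_neuron * bits_per_synapse
total_bytes = total_bits / 8
print(f"{total_bytes / 1e12:.0f} TB")  # ~500 TB, inside the 100 TB-1 PB range

# Four orders of magnitude smaller, per the "way off optimal" claim:
compressed_bytes = total_bytes / 1e4
print(f"{compressed_bytes / 1e9:.0f} GB")  # ~50 GB, i.e. the 10-100 GB figure
```

Different choices for synapses per neuron or bits per synapse shift the total by an order of magnitude either way, which is why the thread's range spans 100 TB to 1 PB.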
If so, why has evolution not found that by now? It has had plenty of time to compress at the circuit level. In fact, we know that the brain performs provably optimal compression of its input data in a couple of domains—see V1 and its evolution toward Gabor-like edge feature detection.
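For readers unfamiliar with the V1 reference: the receptive fields of V1 simple cells are commonly modeled as Gabor filters, a Gaussian envelope multiplied by a sinusoidal carrier. A minimal sketch (sizes and parameters here are illustrative, not fitted to any neural data):

```python
import numpy as np

def gabor_kernel(size=21, wavelength=6.0, theta=0.0, sigma=4.0):
    """2D Gabor filter: a Gaussian envelope times a sinusoidal carrier,
    the standard model of V1 simple-cell receptive fields."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    # Rotate the coordinate frame by the orientation theta.
    xr = x * np.cos(theta) + y * np.sin(theta)
    envelope = np.exp(-(x**2 + y**2) / (2 * sigma**2))
    carrier = np.cos(2 * np.pi * xr / wavelength)
    return envelope * carrier

# An oblique edge detector; convolving an image with this responds
# strongly to edges oriented at 45 degrees.
k = gabor_kernel(theta=np.pi / 4)
print(k.shape)  # (21, 21)
```

Banks of such filters at varying orientations and scales form an efficient code for natural images, which is the sense in which V1's edge detectors approach optimal compression of visual input.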
Evolution has had plenty of time to find well-optimized cellular machinery based on DNA, plenty of time to find well-optimized electro-chemical computing machinery on top of that, and plenty of time to find well-optimized circuits within that space.
Even insects are extremely well-optimized at the circuit level—given their neuron/synapse counts, we have no evidence whatsoever to believe that vastly simpler circuits exist that can perform the same functionality.
When we have used evolutionary exploration algorithms to design circuits directly, given enough time we see similarly complex, messy, but near-optimal designs—and this is a general trend.
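The "given enough time" behavior is easy to see even in a toy setting. Below is a minimal (1+1) evolutionary algorithm evolving a 32-bit configuration toward a target; it is a stand-in for the circuit-evolution experiments mentioned above, not a reproduction of any specific one:

```python
import random

random.seed(0)

GENOME_LEN = 32

def fitness(genome):
    """Count of bits matching the target (here, all ones)."""
    return sum(genome)

# Start from a random configuration.
parent = [random.randint(0, 1) for _ in range(GENOME_LEN)]

for generation in range(2000):
    # Mutate one random bit; keep the child only if it is no worse.
    child = parent[:]
    i = random.randrange(GENOME_LEN)
    child[i] ^= 1
    if fitness(child) >= fitness(parent):
        parent = child

print(fitness(parent))  # reaches the optimum of 32 given enough generations
```

Blind mutation plus selection, with no look-ahead, reliably climbs to the optimum here; in harder design spaces the same process yields the complex, messy, near-optimal solutions described above.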
Are you saying that you are counting every copy of the DNA as information that contributes to the total amount? If so, I say that’s invalid. What if each cell were remotely controlled from a central server containing the DNA information? I can’t see that we’d count the DNA for each cell then—yet it is no different really.
I agree that the number of cells is relevant, because there will be a lot of information in the structure of an adult brain that has come from the environment, rather than just from the DNA, and more cells would seem to imply more machinery in which to put it.
I thought we were talking about the efficiency of the human brain. Wasn't that the whole point? If every cell were remotely controlled from a central server, well, that'd be a whole different algorithm. In fact, we could probably scrap the brain and just run the central server.
Genes actually do matter in the functioning of neurons. Chemical additions (e.g. ethanol) and changes in the environment (e.g. hypoxia) can influence gene expression in brain cells, affecting their function.
I suggest the brain is a ridiculously inefficient contraption thrown together from the building blocks that were practical to produce from DNA representations and suitable for the kinds of environments animals tended to be exposed to. We should be shocked to find that it also manages to be anywhere near optimal for general intelligence. Among other things, that would suggest that evolution packed the wrong lunch.
Okay, I may have misunderstood you. It looks like there is some common ground between us on the issue of inefficiency. I think the brain probably is inefficient as well, since it has to be thrown together by the very specific kind of process that evolution is—one optimized for building things without look-ahead intelligence rather than for achieving the most efficient results.