You might actually be able to do some back-of-the-envelope calculations on this. Humans are slow learners, and end up with reasonable ontologies in a finite number of years. By this old estimate, humans learn about two bits’ worth of long-term memory content per second. Assuming people learn at this rate during 16 hours of waking time each day of their life, that works out to something like 68 megabytes of accumulated permanent memory for a 13-year-old. 13-year-olds can have most of the basic world ontology fixed, and that’s around the age where we stop treating people as children who can be expected to be confused about obvious elements of the world ontology as opposed to subtle ones.
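As a sanity check, here is the arithmetic as a quick Python sketch; the learning rate and waking-hours figures are just the assumptions above, not measured quantities:

```python
# Back-of-the-envelope: long-term memory accumulated by age 13,
# assuming a constant retention rate while awake.
BITS_PER_SECOND = 2        # learning rate from the cited estimate
WAKING_HOURS_PER_DAY = 16  # assumed waking time per day
YEARS = 13

seconds_awake = WAKING_HOURS_PER_DAY * 3600 * 365 * YEARS
total_bits = BITS_PER_SECOND * seconds_awake
print(f"{total_bits / 8 / 1e6:.0f} MB")  # ~68 MB
```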
Hand-crafting a concept kernel that compresses down to that order of magnitude doesn’t seem like an impossible task, but it’s possible there’s something very wrong with the memory accumulation rate estimate.
The 68 megabytes in question should be added on top of any pre-programmed instincts.
Yes. Those would go into the complexity bound for the human genome, since the genome is pretty much the only information source for human ontogeny. The original post suggested 25 MB, which apparently turned out to be too low. If you make the very conservative assumption that all of the human genome is important, the raw limit is somewhere around 750 MB: about three billion base pairs at two bits per base, and less in practice, since the sequence is highly repetitive and should compress well. The genes needed to build and run the brain are going to be just a fraction of the total genome, but I don’t know enough biology to guess at the size of the fraction.
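A minimal sketch of that upper bound, taking roughly 3.1 billion base pairs at two bits each and ignoring both compressibility and the brain-relevant fraction:

```python
# Crude upper bound on the genome's information content.
BASE_PAIRS = 3.1e9   # approximate size of the human genome
BITS_PER_BASE = 2    # four possible bases -> log2(4) bits

raw_bytes = BASE_PAIRS * BITS_PER_BASE / 8
print(f"{raw_bytes / 1e6:.0f} MB")  # ~775 MB, before any compression
```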
Anyway, it looks like even in the worst case the code for an AGI that can do interesting stuff out of the box could fit on roughly a single CD-ROM’s worth of storage (around 700 MB).