Hardcoding a knowledge ontology that would include e.g. all concepts humans have ever thought of is theoretically possible, since those concepts are made up of a finite amount of complexity. It’s just that this would take so very long...
Anyway, I wouldn’t rule out that a sufficient knowledge ontology for a FAI could be semi-manually constructed in a century or two, or perhaps a few millennia. It is also theoretically possible that all the major players in the world could agree that, until then, very strong measures need to be taken to prevent anyone from building anything-that-could-go-UFAI.
I of course wouldn’t claim this probability to be particularly high.
You might actually be able to do some back-of-the-envelope calculations on this. Humans are slow learners, and they end up with reasonable ontologies in a finite number of years. By this old estimate, humans commit about two bits of content to long-term memory per second. Assuming people learn at this rate during 16 hours of waking time each day of their life, a 13-year-old ends up with something like 68 megabytes of accumulated permanent memory (2 bits × 57,600 waking seconds per day × roughly 4,750 days). A 13-year-old can have most of the basic world ontology fixed, and that’s around the age where we stop treating people as children who can be expected to be confused about obvious elements of the world ontology, as opposed to subtle ones.
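The arithmetic here can be sketched in a few lines; the 2 bits/second rate is the cited estimate, and everything else is plain multiplication:

```python
# Back-of-the-envelope: long-term memory accumulated by age 13,
# assuming ~2 bits/second of retained content during waking hours.
# The rate is the estimate cited above; the rest is arithmetic.

BITS_PER_SECOND = 2          # cited long-term retention rate
WAKING_HOURS_PER_DAY = 16
YEARS = 13

seconds_awake = WAKING_HOURS_PER_DAY * 3600 * 365 * YEARS
total_bits = BITS_PER_SECOND * seconds_awake
megabytes = total_bits / 8 / 1e6

print(f"~{megabytes:.0f} MB of accumulated long-term memory")  # ~68 MB
```

Tweaking the rate or the waking hours moves the total around, but any plausible choice lands in the tens of megabytes.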
Hand-crafting a concept kernel that compresses down to that order of magnitude doesn’t seem like an impossible task, but it’s possible there’s something very wrong with the memory accumulation rate estimate.
Those tens of megabytes should be added on top of any pre-programmed instincts.
Yes. Those would go into the complexity bound for the human genome, since the genome is pretty much the only information source for human ontogeny. The original post suggested 25 MB, which apparently turned out to be too low. If you make the very conservative assumption that all of the human genome is important, I think the limit is somewhere around 500 MB. The genes needed to build and run the brain are going to be just a fraction of the total genome, but I don’t know enough biology to guess at the size of the fraction.
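As a sanity check on that bound, the base-pair count and the 2-bits-per-base encoding are standard figures (not from the comment above); the ~500 MB figure presumably already assumes some compression of the raw sequence:

```python
# Upper bound on the information content of the human genome:
# ~3.1 billion base pairs, each one of four bases (A, C, G, T),
# i.e. log2(4) = 2 bits per base before any compression.

BASE_PAIRS = 3.1e9
BITS_PER_BASE = 2

raw_megabytes = BASE_PAIRS * BITS_PER_BASE / 8 / 1e6
print(f"raw upper bound: ~{raw_megabytes:.0f} MB")  # ~775 MB

# The genome is highly repetitive, so its compressed size is
# considerably lower, and the brain-building fraction lower still.
```

Either way, the bound stays in the hundreds of megabytes, which is what the CD-ROM comparison below turns on.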
Anyway, it looks like even in the worst case the code for an AGI that can do interesting stuff out of the box could fit on a single CD-ROM.
Also, by that time people might be sufficiently more complex that hand-coding all the concepts a 21st-century person can hold would make for an interesting historical project, but wouldn’t be enough for a useful FAI.