Aaronson: What you’re saying—correct me if I’m wrong—is that biological evolution never discovered this fact [error-correcting codes].
You’re not wrong. As you point out, it would halt all beneficial mutations as well, and it would create some difficulty with crossover. Can evolution ever invent something like this? Maybe, or maybe it could just invent more efficient copying machinery with a 10^-10 error rate. And then a billion years later, much more complex organisms would abound. All of this is irrelevant, given that DNA is on the way out in much less than a million years, but it makes for fun speculation...
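(To make the "halts beneficial mutations" point concrete, here's a minimal sketch in Python of a genome wrapped in a Hamming(7,4) error-correcting code, my illustrative stand-in for whatever molecular machinery this would actually require. The decoder can't distinguish a beneficial mutation from a copying error, so it reverts both.)

```python
# Illustrative sketch: a genome protected by an error-correcting code
# "repairs" every point mutation, beneficial or not, freezing evolution.
# Hamming(7,4): 4 data bits -> 7-bit codeword, positions 1..7 with
# parity bits at positions 1, 2, 4.

def hamming74_encode(d):
    """Encode 4 data bits as a 7-bit codeword."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4   # covers positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4   # covers positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4   # covers positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_correct(c):
    """Return the codeword with any single-bit error corrected."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3   # 1-based position of the flipped bit
    if syndrome:
        c[syndrome - 1] ^= 1          # the "mutation" is silently undone
    return c

genome = hamming74_encode([1, 0, 1, 1])
mutant = list(genome)
mutant[4] ^= 1                        # a point mutation, beneficial or not
assert hamming74_correct(mutant) == genome
print("mutation reverted:", hamming74_correct(mutant) == genome)
```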
Aaronson: If true, this is a beautiful argument for one of two conclusions: either that (1) digital computers shouldn’t have as hard a time as one might think surpassing billions of years of evolution, or (2) 25MB is enough for pretty much anything!
In an amazing and completely unrelated coincidence, I work in the field of Artificial Intelligence. Did I mention DNA is on the way out in much less than a million years?
Cyan, MacKay’s paper talks about gaining bits as in bits on a hard drive, which goes as O(N^1/2) because of Price’s Equation and because the standard deviation of the sum of N bits goes as the square root of N. I spent some time scribbling and didn’t manage to prove that this never gains more than one bit of information relative to a fitness function, but I don’t see how eliminating half the organisms in a randomly mixed gene pool can gain more than one bit of information in the allele frequencies.
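(A quick numerical sketch of that square-root scaling, with population and genome sizes that are my arbitrary choices, not parameters from MacKay's paper: if fitness is the sum of N fair bits, its standard deviation is sqrt(N)/2, and culling the less-fit half of a randomly mixed pool shifts mean fitness by about 0.8 of that standard deviation.)

```python
# Sketch: one round of truncation selection on random genomes.
# Fitness = sum of N fair bits, so sigma = sqrt(N * 0.25) = sqrt(N)/2,
# and keeping the top half of a roughly normal distribution shifts the
# mean by about 0.8 * sigma, i.e. O(sqrt(N)) hard-drive-style bits.
import random

N = 1000      # bits per genome (illustrative)
POP = 10_000  # organisms in the randomly mixed gene pool (illustrative)

fitness = sorted(
    (sum(random.randint(0, 1) for _ in range(N)) for _ in range(POP)),
    reverse=True,
)
before = sum(fitness) / POP
after = sum(fitness[: POP // 2]) / (POP // 2)  # eliminate the lower half

sigma = (N * 0.25) ** 0.5
print(f"gain from one round of selection: {after - before:.1f}")
print(f"predicted ~0.8 * sigma:           {0.8 * sigma:.1f}")
```

Note that however large that fitness shift is, the selection event itself, keep-or-cull, is a single binary outcome per organism applied to the pool as a whole, which is the intuition behind the one-bit ceiling above.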