Eliezer, sorry for spamming, but I think I finally understand what you were getting at.
Von Neumann showed in the 1950s that there's no in-principle limit to how big a computer one can build: even if some fraction p of the bits get corrupted at every time step, then as long as p is smaller than some threshold, one can use a hierarchical error-correcting code to correct the errors faster than they happen. Today we know that the same is true even for quantum computers.
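(For concreteness, here's a minimal sketch of that threshold phenomenon, under toy assumptions: independent bit-flips with probability p per sub-block, and noiseless majority voting on a concatenated 3-bit repetition code. The function name `logical_error_rate` is mine, not anything from von Neumann; and in real fault-tolerant constructions, where the voting hardware is itself noisy, the threshold is far below the p = 1/2 that this toy model gives.)

```python
# Toy model of hierarchical error correction: a 3-bit repetition code,
# concatenated with itself k times.  A level-(k+1) block fails only if
# at least two of its three level-k sub-blocks fail, so
#
#     P_{k+1} = 3 * P_k^2 * (1 - P_k) + P_k^3.
#
# Below the fixed point p = 1/2, each level of concatenation shrinks
# the error rate (roughly squaring it); above it, the errors win.

def logical_error_rate(p: float, levels: int) -> float:
    """Error rate of a `levels`-fold concatenated 3-bit repetition code."""
    for _ in range(levels):
        p = 3 * p**2 * (1 - p) + p**3  # probability >= 2 of 3 sub-blocks fail
    return p

for p in (0.01, 0.4, 0.6):
    rates = [logical_error_rate(p, k) for k in range(5)]
    print(f"p = {p}: " + ", ".join(f"{r:.3g}" for r in rates))
```

Running it, p = 0.01 plunges doubly exponentially toward zero with each level, while p = 0.6 climbs toward 1: errors get corrected faster than they happen on one side of the threshold, and compound on the other.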
What you’re saying—correct me if I’m wrong—is that biological evolution never discovered this fact.
If true, this is a beautiful argument for one of two conclusions: either that (1) digital computers shouldn’t have as hard a time as one might think surpassing billions of years of evolution, or (2) 25MB is enough for pretty much anything!