Doesn’t the “irrelevance” depend on an assumption about how our minds work? If we have some analog computation going on, the bit sequences might be a mere approximation to our actual inductive reasoning. Of course, once we put our conclusions into language, those conclusions can be digitized—but that might happen at a relatively late stage of the reasoning game.
Analog computation is another irrelevance—analog systems can always be approximated arbitrarily closely by discrete systems. Witness the ongoing digital revolution.
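A minimal sketch of that claim (the toy signal and function names are my own illustration, not anything from this thread): uniform quantization of a continuous signal roughly halves the worst-case error with each added bit, so a discrete system can track an "analog" one as closely as you care to pay for.

```python
import numpy as np

# A stand-in "analog" signal: a continuous function sampled densely.
t = np.linspace(0.0, 1.0, 10_000)
analog = np.sin(2 * np.pi * 5 * t) + 0.3 * np.sin(2 * np.pi * 17 * t)

def quantize(signal, bits):
    """Uniformly quantize a signal onto 2**bits discrete levels."""
    lo, hi = signal.min(), signal.max()
    levels = 2 ** bits
    # Map to integer codes, round, then map back to the original range.
    codes = np.round((signal - lo) / (hi - lo) * (levels - 1))
    return codes / (levels - 1) * (hi - lo) + lo

for bits in (4, 8, 12, 16):
    err = np.max(np.abs(analog - quantize(analog, bits)))
    print(f"{bits:2d} bits: max error ~ {err:.2e}")
# Each extra bit roughly halves the worst-case error, so the discrete
# approximation can be made arbitrarily close to the analog original.
```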
Sure, if you are designing a system from the ground up. I’ve never needed more than 12 bits of precision for any real engineering purpose, and I wouldn’t pay a single dollar for more. But Wei_Dai’s question was about us—or at least reads literally that way. Maybe I took it too literally.
There’s no credible evidence that we aren’t digital; see, e.g., digital physics. If nobody can tell the difference, the issue is probably in the realm of vacuous philosophy.