Why should human choices being randomized by some hypothetical primordial ‘freebits’ be any different in practice from them being randomized by the seething lottery-ball bounces of trillions of molecules at hundreds of meters per second inside cells? That’s pretty damn random.
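For scale, “hundreds of meters per second” is about right: the Maxwell-Boltzmann mean speed of a water molecule at body temperature comes out near 600 m/s. A minimal numerical check, assuming nothing beyond the textbook mean-speed formula:

```python
# Mean thermal speed of a water molecule at T = 310 K (body temperature),
# from the Maxwell-Boltzmann mean-speed formula v = sqrt(8*k*T / (pi*m)).
import math

k = 1.380649e-23                   # Boltzmann constant, J/K
T = 310.0                          # body temperature, K
m = 18.015e-3 / 6.02214076e23      # mass of one H2O molecule, kg (~2.99e-26)

v_mean = math.sqrt(8 * k * T / (math.pi * m))
print(f"mean speed of H2O at 310 K: {v_mean:.0f} m/s")  # ~604 m/s
```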
In both cases, the question that interests me is whether an external observer could build a model of the human, by non-invasive scanning, that let it forecast the probabilities of future choices in a well-calibrated way. If the freebits or the trillions of bouncing molecules inside cells served only as randomization devices, then they wouldn’t create any obstruction to such forecasts. So the relevant possibility here is that the brain, or maybe other complex systems, can’t be cleanly decomposed into a “digital computation part” and a “microscopic noise part,” such that the former sees the latter purely as a random number source. Again, I certainly don’t know that such a decomposition is impossible, but I also don’t know any strong arguments from physics or biology that assure us it’s possible—as they say, I hope future research will tell us more.
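To make “well-calibrated” concrete: among all the moments where the model assigns probability near p to a choice, the choice should actually occur a fraction near p of the time. A toy sketch of that check, with synthetic stand-in data rather than any actual brain model:

```python
# Toy calibration check: bin a forecaster's predicted probabilities and
# compare each bin's forecasts to the observed outcome frequency. The
# "model" here is synthetic stand-in data, not a claim about brains.
import numpy as np

rng = np.random.default_rng(0)
predicted = rng.uniform(0.0, 1.0, size=100_000)   # the model's forecasts
outcomes = rng.random(100_000) < predicted        # perfectly calibrated toy outcomes

edges = np.linspace(0.0, 1.0, 11)
for lo, hi in zip(edges[:-1], edges[1:]):
    mask = (predicted >= lo) & (predicted < hi)
    if mask.any():
        print(f"forecast in [{lo:.1f}, {hi:.1f}): "
              f"observed frequency {outcomes[mask].mean():.3f}")
```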
In the paper you also wrote:

With the brain, by contrast, it’s not nearly so obvious that the “Knightian indeterminism source” can be physically swapped out for a different one, without destroying or radically altering the brain’s cognitive functions as well.
But given the relatively large amplitude of the microscopic thermal noise that CellBioGuy points to, what evolutionary reason would favor a strong role for quantum freebits? After all, thermal noise is already far beyond the predictive reach of any rival or predator organism. So the organism is safe from being too predictable, even if it harnesses only probabilistic randomization sources. Or it might amplify both types of randomness, thermal noise and quantum freebits alike. But in that case I’d expect the thermal noise to dominate the cognitive and behavioral outcomes, just because thermal noise is so richly available.
My thoughts exactly. Real randomness and sufficiently advanced pseudorandomness would be equally good for practical purposes, all other things being equal, but it might well have been easier to tap into noise than to evolve a PRNG. So we may have ended up with incompatibilist free will by a kind of accident.
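A toy illustration of that practical interchangeability: below, bytes from the operating system’s entropy pool (os.urandom, which harvests environmental noise) sit next to bytes from a tiny deterministic PRNG, and a crude bit-frequency test can’t tell them apart. The xorshift generator is just a stand-in for “sufficiently advanced pseudorandomness,” not a claim about what evolution would build.

```python
# Crude comparison of a "real" noise source with a deterministic PRNG.
# os.urandom taps entropy collected by the OS; xorshift64 is a minimal
# deterministic generator (Marsaglia xorshift). Both pass a naive
# bit-frequency check equally well.
import os

def xorshift64(seed: int):
    """Yield 64-bit pseudorandom integers from a nonzero seed."""
    x = seed
    while True:
        x ^= (x << 13) & 0xFFFFFFFFFFFFFFFF
        x ^= x >> 7
        x ^= (x << 17) & 0xFFFFFFFFFFFFFFFF
        yield x

def ones_fraction(data: bytes) -> float:
    """Fraction of 1-bits in a byte string; ~0.5 for random-looking data."""
    return sum(bin(b).count("1") for b in data) / (8 * len(data))

noise = os.urandom(1 << 16)  # 64 KiB from the OS entropy pool

gen = xorshift64(0x9E3779B97F4A7C15)
pseudo = b"".join(next(gen).to_bytes(8, "little") for _ in range(1 << 13))

print(f"noise ones fraction:  {ones_fraction(noise):.4f}")   # ~0.5
print(f"pseudo ones fraction: {ones_fraction(pseudo):.4f}")  # ~0.5
```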