Could Scott_Aaronson, or anyone who knows what he’s talking about, please tell me the name of the n/4 left/right bits problem he’s referring to, or otherwise give me a reference for it? His explanation doesn’t seem to make sense: the deterministic algorithm needs to examine n/4 + 1 bits only in the worst case, so you can’t compare that to the average-case cost of the randomized algorithm. (The average case for the deterministic algorithm would, it seems, be n/8 + 1.) Furthermore, I don’t understand how the random method could average out to a size-independent constant.
Is the randomized algorithm one that uses a quantum computer or something?
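For what it’s worth, here is the standard intuition for a size-independent expected cost, independent of whatever the specific problem is: if each uniformly random probe resolves the question with some constant probability p, the number of probes until success is geometrically distributed with mean 1/p, no matter how large n is. A minimal sketch of that effect — the promise that a constant fraction of positions is “revealing” is my own assumption for illustration, not necessarily the problem being referred to:

```python
import random

def probes_until_revealing(n, revealing_fraction=0.25, rng=random):
    """Count uniform random probes (with replacement) until one lands on a
    'revealing' position. Hypothetical setup: the first revealing_fraction
    of positions immediately settle the answer."""
    revealing = int(n * revealing_fraction)  # positions [0, revealing) are revealing
    count = 0
    while True:
        count += 1
        if rng.randrange(n) < revealing:
            return count

rng = random.Random(0)
for n in (100, 10_000, 1_000_000):
    mean = sum(probes_until_revealing(n, rng=rng) for _ in range(5000)) / 5000
    print(n, round(mean, 2))  # stays near 1/p = 4 probes for every n
```

The expected probe count is 1/p = 4 at every input size, which is why the randomized average can be a constant even though any deterministic strategy can be forced to read a linear number of bits in the worst case.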