It's actually O(n log n) if you use a cleverer approach, which is fortunate. (To get the length, you do a binary search, which takes O(log n) genie queries. To get it down to a single proof of that length, you do another O(log n)-round binary search, this time on the number of pseudorandom constraints that can be simultaneously satisfied.)
I’m confused. Which thing are you saying is O(n log n)?
You use O(log n) genies in the process, each of which you must spend O(n) time verifying. It's an O(log n) slowdown compared to just getting the answer directly.
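To make the first step concrete, here is a minimal Python sketch of the binary search on the proof length. A brute-force enumeration stands in for the genie (the real genie is an untrusted prover whose claimed proof we only need to check ourselves in time linear in its length); the function names and the toy verifier are my own illustration, not part of the original argument.

```python
import itertools

def genie_find_proof(verifier, max_len):
    """Stand-in for the genie: return some accepted proof of length <= max_len, or None.
    (Brute force here; in the argument the genie is an untrusted oracle whose claimed
    proof we verify ourselves in O(n) time.)"""
    for length in range(max_len + 1):
        for bits in itertools.product("01", repeat=length):
            proof = "".join(bits)
            if verifier(proof):
                return proof
    return None

def shortest_proof_length(verifier, n):
    """Binary search on m for the length of the shortest accepted proof of length <= n.
    Each round costs one genie query plus an O(n)-time verification of its answer, and
    there are O(log n) rounds, giving the O(n log n) total mentioned above."""
    if genie_find_proof(verifier, n) is None:
        return None  # no proof of length <= n at all
    lo, hi = 0, n
    while lo < hi:
        mid = (lo + hi) // 2
        candidate = genie_find_proof(verifier, mid)
        # We do not trust the genie: re-check its answer before using it.
        if candidate is not None and verifier(candidate) and len(candidate) <= mid:
            hi = mid       # a proof of length <= mid exists
        else:
            lo = mid + 1   # the genie could not produce one; the shortest is longer
    return lo

# Toy example: a "proof" is any bit string containing the substring "101".
if __name__ == "__main__":
    verifier = lambda s: "101" in s
    print(shortest_proof_length(verifier, n=8))  # -> 3
```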
I see how you can use log(n) genies to get the length, but I don’t think you can then use log(n) genies to get the lexicographically first proof of length n. That only provides enough bits of information to uniquely identify a proof if there are at most O(2^(log n)) = O(n) of them, which is not necessarily the case.
You can’t get the lexicographically first proof. You can get a uniformly random one though. (I don’t remember who the argument is due to—probably Vazirani). This is a very rough sketch.
If there are initially N solutions and you add k random constraints, each satisfied by half of all possible proofs, then you expect N / 2^k solutions to remain. You can binary search on k to find k ~ log N, at which point only one or two solutions survive. The genie has to say more in the last round, but there is only one thing he can say that satisfies all of the random constraints (you can do all of this carefully with a negligible chance of error). Since log N is at most n, the number of bits, the binary search takes O(log n) rounds.
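Here is a minimal Python sketch of that second binary search, under the same toy setup: random parity (XOR) constraints play the role of the random constraints (each nontrivial one is satisfied by exactly half of all n-bit strings, which is what the N / 2^k calculation needs), and a brute-force search again stands in for the genie. The function names and the toy verifier are illustrative assumptions rather than anything from the original argument.

```python
import itertools
import random

def random_parity_constraint(n):
    """A random XOR constraint over n bits: a random subset of positions and a random
    target parity. A nonempty subset is satisfied by exactly half of all n-bit strings."""
    subset = [i for i in range(n) if random.random() < 0.5]
    target = random.randrange(2)
    return subset, target

def satisfies(proof, constraint):
    subset, target = constraint
    return sum(proof[i] for i in subset) % 2 == target

def genie(verifier, n, constraints):
    """Stand-in for the genie: return some accepted n-bit proof consistent with every
    constraint, or None. (Brute force here; we only rely on being able to check its
    answer against the verifier and the constraints in O(n) time.)"""
    for proof in itertools.product((0, 1), repeat=n):
        if verifier(proof) and all(satisfies(proof, c) for c in constraints):
            return proof
    return None

def sample_proof(verifier, n):
    """Binary search on k, the number of random parity constraints, for the threshold
    where the constrained problem flips from satisfiable to unsatisfiable. Near that
    threshold (k ~ log N when there are N solutions) roughly one proof survives, so the
    genie's final answer is essentially forced. Since log N <= n, this is O(log n)
    rounds; the real argument repeats the experiment to make the failure probability
    negligible, which this sketch does not bother with."""
    lo, hi = 0, n
    while lo < hi:
        k = (lo + hi + 1) // 2
        constraints = [random_parity_constraint(n) for _ in range(k)]
        if genie(verifier, n, constraints) is None:
            hi = k - 1   # too many constraints: every proof was killed
        else:
            lo = k       # still satisfiable with k constraints
    constraints = [random_parity_constraint(n) for _ in range(lo)]
    return genie(verifier, n, constraints)

# Toy example: "proofs" are 6-bit strings with an even number of ones.
if __name__ == "__main__":
    verifier = lambda proof: sum(proof) % 2 == 0
    print(sample_proof(verifier, n=6))
```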