I wonder if there’s a simple worst-case proof showing how complicated the seeds need to get before the algorithm finds the true optimum. For example, if we search for the best integer under 10^85 rather than under 10^100, the seed that leads the algorithm to output the optimum is different, or at least the overlap seems small. But I’m having a hard time proving anything about this algorithm: although small seed numerators could in principle add up to almost anything, in practice they won’t.