In a great many sensible situations, the number of possible hypotheses is not infinite. For example, suppose the question is "Who murdered Fred?", because we have already learned that he was murdered. The answer we already know, "a human alive at the time he died," makes the set finite. If we can determine when and where he died, the number of suspects can typically be reduced to dozens or hundreds. Restricting the set further to people capable of carrying out the means of death may eliminate 90% of those.
To the extent that "bits of evidence" means things we don't yet know, the number of bits required can be much smaller than suggested. To the extent that it includes everything we already know, we each carry trillions of bits in our brains and the minimum number is meaningless.
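One way to see why the required number of new bits can be small: singling out one hypothesis from a set of N takes about log2(N) bits of evidence, so each step that shrinks the suspect pool pays off logarithmically. Here is a minimal sketch; the population figures are illustrative assumptions of mine, not numbers from the argument above:

```python
import math

def bits_needed(n_hypotheses: int) -> float:
    """Bits of evidence required to isolate one hypothesis out of n."""
    return math.log2(n_hypotheses)

# Hypothetical suspect-pool sizes, chosen only for illustration.
for label, n in [("every human alive", 3_000_000_000),
                 ("in the right place that night", 100_000),
                 ("capable known suspects", 100)]:
    print(f"{label:30s} N={n:>13,d}  ~{bits_needed(n):.1f} bits")
# every human alive              N=3,000,000,000  ~31.5 bits
# in the right place that night  N=      100,000  ~16.6 bits
# capable known suspects         N=          100  ~6.6 bits
```

Even starting from "any human alive," pinning down a murderer takes only a few dozen bits of genuinely new evidence, and each narrowing step above removes most of the remaining requirement.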
The mathematical inconsistency between quantum mechanics and general relativity illustrates a key point. Most of the time the hypothesis set for new solutions, rather than being infinite, is empty. It is often quite easy to show that every available theory is wrong. Even when we know our theory is clearly inconsistent with reality, we keep using it until we come up with something better. Even if General Relativity had been contradicted by some experimental discovery in 1963, Einstein would still be lauded as a scientist for finding a theory that fit more data points than the previous one.
In science, and in a lot of other contexts, simply showing that a theory could be right is much more important than establishing to any degree of statistical significance that it is right.