Spitballing hacks around this:
Weigh hypotheses by how many steps it takes to compute them on a dovetailing of all Turing machines (first sketch after this list). This would probably put too much weight on programs that are large but fast to compute.
Weigh hypotheses by how much space it takes to compute them: dovetail all Turing machines of size up to n, limited to n bits of space, for at most 2^n steps (second sketch after this list). This has the nice property that the prior is, like the hypotheses, space-limited (using about twice as much space as the hypothesis).
Find some quantum algorithm that uses n^k qubits and polynomial time to magically evaluate all programs of length n in some semi-reasonable way. If such a beast exists (which I doubt), it has the nice property that it “considers” all reasonably sized hypotheses yet runs in polynomial space and time.
Given n bits of evidence about the universe, consider all programs of length up to k1*log(n), each run for at most n^k2 steps (third sketch after this list). Since there are only about n^k1 such programs, this has the nice property that it runs in polynomial time and space.
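To make the first hack concrete, here is a toy sketch of a round-robin dovetailer in Python. The "programs" are hand-written generators rather than a real enumeration of Turing machines, and weighting a program by the reciprocal of the global step count at which it halts is just one choice of decreasing function. The output shows exactly the worry from the first item: the large-but-fast hypothesis soaks up almost all the weight.

```python
# Toy dovetailer: run every live program one step per round and weight
# each program by 1 / (global step count at which it halts).
def dovetail_weights(programs, max_global_steps=10_000):
    """programs: list of (name, generator_factory); each generator yields
    None once per computation step and finally yields ("halt", output)."""
    running = {name: factory() for name, factory in programs}
    weights = {}
    global_step = 0
    while running and global_step < max_global_steps:
        for name in list(running):            # one step of each live program
            global_step += 1
            result = next(running[name])
            if isinstance(result, tuple) and result[0] == "halt":
                weights[name] = 1.0 / global_step
                del running[name]
    total = sum(weights.values())
    return {name: w / total for name, w in weights.items()}


# Two toy hypotheses: a small-but-slow one and a large-but-fast one.
def slow_small():
    for _ in range(500):
        yield None                            # pretend computation steps
    yield ("halt", "output of the small, slow program")


def fast_large():
    for _ in range(5):
        yield None
    yield ("halt", "output of the large, fast program")


# fast_large ends up with ~98% of the weight, illustrating the concern.
print(dovetail_weights([("slow_small", slow_small), ("fast_large", fast_large)]))
```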
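The space-weighted version, sketched under a stand-in machine model: each candidate is assumed to report how many tape cells it touched, and the enumeration of machine descriptions of size up to n is assumed away. The 2^n step cap and n-bit space cap follow the item above; weighting by 2^-(space used) is one natural reading of "weigh by space".

```python
# Space-weighted prior sketch.  A "candidate" is a function
# run(step_budget) that either returns (output, cells_touched) on halting
# or None if it does not halt within the budget; this stands in for
# directly simulating Turing machine descriptions of size up to n.
def space_weighted_prior(candidates, n):
    weights = {}
    for name, run in candidates:
        result = run(step_budget=2 ** n)      # at most 2^n steps
        if result is None:
            continue                          # did not halt in time
        output, space_used = result
        if space_used > n:
            continue                          # exceeded the n-bit space cap
        weights[name] = 2.0 ** (-space_used)  # weigh by space consumed
    total = sum(weights.values())
    return {name: w / total for name, w in weights.items()}


# Toy candidates that "use" 3 and 7 cells respectively.
def tiny(step_budget):
    return ("output-a", 3) if step_budget >= 10 else None


def roomy(step_budget):
    return ("output-b", 7) if step_budget >= 100 else None


# tiny gets ~94% of the weight, roomy ~6%.
print(space_weighted_prior([("tiny", tiny), ("roomy", roomy)], n=10))
```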
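The last hack can be sketched end to end, since its program space is tiny by construction. The interpreter below (`toy_run`) is a deliberately silly stand-in that just repeats the program string; the resource accounting, though, matches the item above: at most 2^(k1*log2 n) = n^k1 programs, each run for at most n^k2 steps, so roughly n^(k1+k2) work in total.

```python
import math
from itertools import product


# Enumerate every program of length up to k1*log2(n), run each for at
# most n**k2 steps, keep those whose output matches the evidence, and
# weight the survivors by 2^-length.
def poly_bounded_prior(evidence_bits, run_program, k1=2, k2=2):
    n = len(evidence_bits)
    max_len = int(k1 * math.log2(n))          # programs up to k1*log(n) bits
    step_limit = n ** k2                      # each run for at most n^k2 steps
    weights = {}
    for length in range(1, max_len + 1):
        for bits in product("01", repeat=length):   # ~n^k1 programs in total
            program = "".join(bits)
            output = run_program(program, step_limit)
            if output is not None and output.startswith(evidence_bits):
                weights[program] = 2.0 ** (-length)
    total = sum(weights.values())
    return {p: w / total for p, w in weights.items()}


# Hypothetical toy interpreter: "compute" by repeating the program string
# out to step_limit characters (not a real machine model).
def toy_run(program, step_limit):
    return (program * (step_limit // len(program) + 1))[:step_limit]


# With 16 bits of alternating evidence, only the programs "01", "0101",
# "010101", and "01010101" survive, and "01" gets most of the weight.
print(poly_bounded_prior("0101010101010101", toy_run, k1=2, k2=2))
```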