The analog would be to theorem proving. No one claims that knowing the axioms of math gets you to every theorem “very fast”—because the problem of finding a proof/disproof for an arbitrary proposition is also uncomputable.
A “solution” might be that only proofs matter, while theorems (as formulas) are in general meaningless in themselves, only useful as commentary on proofs.
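To make the theorem-proving side concrete: proof search is semi-decidable, not decidable. You can enumerate derivations and halt if a proof turns up, but no procedure tells you in advance whether to expect one. Here is a toy illustration (my example, not from the discussion) using Hofstadter's MIU string-rewriting system, where breadth-first search finds a derivation when one exists:

```python
from collections import deque

def miu_successors(s):
    """Apply each MIU rewrite rule to s, collecting all results."""
    out = set()
    if s.endswith("I"):
        out.add(s + "U")                      # rule 1: xI -> xIU
    if s.startswith("M"):
        out.add("M" + s[1:] * 2)              # rule 2: Mx -> Mxx
    for i in range(len(s) - 2):
        if s[i:i + 3] == "III":
            out.add(s[:i] + "U" + s[i + 3:])  # rule 3: III -> U
    for i in range(len(s) - 1):
        if s[i:i + 2] == "UU":
            out.add(s[:i] + s[i + 2:])        # rule 4: UU -> (drop)
    return out

def find_derivation(target, max_steps=100000):
    """Breadth-first proof search from the axiom "MI".

    Halts with a derivation if one exists within the search budget;
    in general such a search is only a semi-decision procedure, which
    is exactly why "knowing the axioms" does not make theorems cheap.
    """
    frontier = deque([("MI", ("MI",))])
    seen = {"MI"}
    steps = 0
    while frontier and steps < max_steps:
        s, path = frontier.popleft()
        if s == target:
            return path
        steps += 1
        for t in miu_successors(s):
            # Prune absurdly long strings so the toy search terminates.
            if t not in seen and len(t) <= 2 * len(target) + 4:
                seen.add(t)
                frontier.append((t, path + (t,)))
    return None
```

`find_derivation("MIU")` succeeds in one step, while `find_derivation("MU")` exhausts the (pruned) space and returns `None`; "MU" is famously underivable, but the search itself cannot tell "not yet found" from "not derivable" without the side argument about the invariant count of I's.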
Nevertheless, the original point stands: no one says “I’ve discovered math! Now I can learn the answer to any math problem very fast.” In contrast, you are saying that because we have Solomonoff induction, we can infer distributions “very fast”.
To be more precise, we can specify the denotation of distributions very close to the real deal from very few data. This technical sense doesn’t allow the analogy with theorem-proving, which is about algorithms, not denotation.
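The "very fast from very few data" claim can at least be gestured at with a resource-bounded caricature (my sketch, and a severe one: real Solomonoff induction enumerates all programs for a universal machine, which is precisely what makes it uncomputable). Here "programs" are just repeating seed patterns, weighted by a 2^-length prior:

```python
from itertools import product

def toy_solomonoff_predict(observed, max_len=8):
    """Predict the next bit by weighted vote over toy 'programs'.

    A 'program' of length n is a seed bitstring repeated forever,
    with prior weight 2**-n; only programs whose output matches the
    observed prefix contribute. This is a drastic, computable stand-in
    for the uncomputable universal prior.
    """
    weight = {0: 0.0, 1: 0.0}
    for n in range(1, max_len + 1):
        for seed in product((0, 1), repeat=n):
            # Extend the repeating pattern past the observed data.
            stream = seed * ((len(observed) // n) + 2)
            if stream[:len(observed)] == tuple(observed):
                # Shorter seeds get exponentially more prior weight.
                weight[stream[len(observed)]] += 2.0 ** (-n)
    total = weight[0] + weight[1]
    return {b: w / total for b, w in weight.items()}
```

After seeing just `0, 1, 0, 1, 0, 1`, the short seed `01` dominates the posterior and the predictor bets heavily on `0` next — that is the sense in which few data suffice. The catch the thread is circling is that this only works here because I replaced the universal machine with a trivially enumerable program class.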
When someone says “very fast, but uncomputable”, what I hear is “dragon in garage, but invisible”.
Generalize that to a good chunk of classical math.
But the analogy is in terms of the “fast but uncomputable” oxymoron.