I didn’t get the impression that Bayesian inference itself was going to produce intelligence
I do get that impression from people who blithely talk of “Bayesian superintelligences”. Example. What work is the word “Bayesian” doing there?
In this example, a Bayesian superintelligence is conceived as having a prior distribution over all possible hypotheses (for example, a complexity-based prior) and using its observations to optimally converge on the right one. You can even make a theoretically optimal learning algorithm that provably converges on the best hypothesis. (I forget the reference for this.) Where this falls down is the exponential explosion of the hypothesis space with complexity. There's no use in a perfect optimiser that takes longer than the age of the universe to do anything useful.
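To make the blow-up concrete, here is a toy sketch of my own (not from anything referenced above): enumerative Bayesian updating over binary-string hypotheses with a 2^-length complexity prior. The "cycle the string to predict the next bit" rule and all the names are illustrative assumptions; the point is the final loop, which shows the hypothesis space doubling with every extra bit of description length.

```python
import itertools

def hypotheses(max_len):
    """Enumerate every binary-string hypothesis up to max_len bits."""
    for length in range(1, max_len + 1):
        for bits in itertools.product("01", repeat=length):
            yield "".join(bits)

def predict(h, t):
    """Toy predictor: hypothesis h predicts observation t by cycling its own bits."""
    return h[t % len(h)]

def posterior(observations, max_len):
    """Weight each hypothesis by its complexity prior 2**(-len) times a 0/1 likelihood."""
    weights = {}
    for h in hypotheses(max_len):
        prior = 2.0 ** (-len(h))
        fits = all(predict(h, t) == obs for t, obs in enumerate(observations))
        if fits:
            weights[h] = prior
    total = sum(weights.values())
    return {h: w / total for h, w in weights.items()}

if __name__ == "__main__":
    obs = "0101"  # observed bits so far
    post = posterior(obs, max_len=8)
    best = max(post, key=post.get)
    print("best hypothesis:", best, "posterior:", round(post[best], 3))
    # The problem: the number of hypotheses of description length n is 2**n,
    # so exhaustive Bayesian updating over them is exponentially expensive.
    for n in (10, 20, 30, 40):
        print(f"hypotheses of length {n}: {2**n:,}")
```

The updating itself is trivial; what kills the enumerative approach is the loop at the end, which is why "Bayesian" on its own buys you optimality in principle but not tractability.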