Hmm, and the foom belief (for instance) is based on Bayesian statistics how?
I don’t think it’s based on Bayesian statistics any more than any other belief is (or isn’t). To take Eliezer specifically, he was interested in the Singularity—specifically, the Good/Vingean observation that a machine more intelligent than us ought to be better than us at creating a still more intelligent machine—long before he had his ‘Bayesian enlightenment’, so his shift to subjective Bayesianism may have increased his belief in intelligence explosions, but certainly didn’t cause it.