Personally, when he was writing the Sequences, I found it a little obnoxious how he kept saying “I was totally on the wrong track and mistaken before I was enlightened & came to understand Bayesian statistics, but now I have a chance of being less wrong”—once is enough, we get it already, I’m not that interested in your intellectual evolution.
Hmm, and the foom belief (for instance) is based on Bayesian statistics how?
That’s pretty damn interesting, because I’ve understood Bayesian statistics for ages, understood how wrong you are without it, and also understood how computationally expensive it is: just think what sort of data you need to attach to each proposition to avoid double-counting evidence, to avoid any form of circular updates, and to avoid naive-Bayes mistakes… Even worse, think how prone it is to drawing faulty conclusions from a partial set of propositions (as generated by, e.g., exploring ideas, which by the way introduces another form of circularity, since you tend to use the ideas you already think are probable as starting points more often).
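To make the double-counting point concrete, here is a minimal sketch in Python (all numbers are made up for illustration, not taken from anyone’s actual model): two “independent” reports that in fact trace back to the same source, fed through a textbook Bayes update. Updating on both copies yields an overconfident posterior; noticing the correlation is exactly the kind of per-proposition bookkeeping described above.

```python
# Minimal sketch (hypothetical numbers): double-counting correlated evidence.
# Two reports of the same underlying observation get treated as if they
# were independent pieces of evidence.

def update(prior, p_evidence_if_true, p_evidence_if_false):
    """One Bayes update: returns P(H | E) given P(H), P(E | H), P(E | ~H)."""
    numerator = prior * p_evidence_if_true
    return numerator / (numerator + (1 - prior) * p_evidence_if_false)

prior = 0.5                          # assumed prior on the hypothesis
p_if_true, p_if_false = 0.8, 0.2     # assumed likelihoods of seeing the report

# Correct: both reports are copies of one observation, so update once.
correct = update(prior, p_if_true, p_if_false)

# Naive: treat the duplicate report as fresh, independent evidence.
naive = update(correct, p_if_true, p_if_false)

print(f"correct posterior: {correct:.2f}")  # 0.80
print(f"naive posterior:   {naive:.2f}")    # 0.94 -- overconfident
```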
Seriously, he should try to write software that does updates correctly on a graph with cycles and correlated propositions. That might result in another enlightenment, hopefully one leading not to increased confidence but to decreased confidence. Statistics isn’t easy to do right, and relatively minor bugs easily lead to major errors.
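And for the cycle problem specifically, here is a toy sketch (again with made-up numbers and a deliberately naive update rule, not any particular published algorithm) of what goes wrong if you update blindly around a loop of propositions: a single observation about A gets passed to B, then to C, then back to A, and every pass around the cycle re-counts the same evidence until everything looks nearly certain.

```python
# Minimal sketch (all numbers hypothetical): naive updating around a
# cycle A -> B -> C -> A. Each node treats its predecessor's current
# belief as fresh independent evidence, so one observation about A
# keeps flowing around the loop and re-counts itself every pass.

def odds(p):  return p / (1 - p)
def prob(o):  return o / (1 + o)

# Assumed link strength: a predecessor *known* to be true would count as
# a 4:1 likelihood ratio in favour of its successor (1:4 if known false).
LR_TRUE, LR_FALSE = 4.0, 0.25

def message(p_pred):
    """Evidence strength passed along an edge, naively interpolated
    between the two extremes by the predecessor's current belief."""
    return p_pred * LR_TRUE + (1 - p_pred) * LR_FALSE

belief = {"A": 0.8, "B": 0.5, "C": 0.5}   # one real observation, about A only

for sweep in range(5):
    for src, dst in [("A", "B"), ("B", "C"), ("C", "A")]:
        belief[dst] = prob(odds(belief[dst]) * message(belief[src]))
    print(sweep, {k: round(v, 3) for k, v in belief.items()})

# All three beliefs climb toward 1.0, even though only one piece of
# evidence ever entered the system -- the cycle launders A's evidence
# back into A as if it were new.
```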
I don’t think it’s based on Bayesian statistics any more than any other belief may (or may not) be. To take Eliezer specifically, he was interested in the Singularity—specifically, the Good/Vingean observation that a machine more intelligent than us ought to be better than us at creating a still more intelligent machine—long before he had his ‘Bayesian enlightenment’, so his shift to subjective Bayesianism may have increased his belief in intelligence explosions, but it certainly didn’t cause it.