I’ve been looking through some of Eliezer’s posts on the subject and the closest I’ve come is “Where Recursive Justification Hits Bottom”, which looks at the problem that if you start with a sufficiently bad prior you will never attain accurate beliefs.
This is a slightly different problem from the one I pointed out (though no less serious; in fact, I would say it's more likely by several orders of magnitude). However, unlike that case, where there really is nothing you can do but try to self-improve and hope you started above the cut-off point, my problem seems like it might have an actual solution; I just can't see what it is.