In discussions here (ETA: meaning, in the Less Wrong community), I mostly take it for granted that people have adopted the Bayesian perspective promoted in Eliezer’s sequences. I think that one can make a pretty good argument (although mathematical rigour is too much to ask for) that receiving information through one’s senses can never be enough to justify absolute certainty about anything external. But I’d rather not try to make it here (ETA: meaning, in this discussion thread).
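To illustrate the claim rather than argue it: under Bayes' rule, as long as the alternative hypothesis assigns the evidence nonzero probability, no finite stream of sensory evidence drives a nonzero prior all the way to probability 1. A minimal sketch (my illustration, not from the post; the specific numbers are arbitrary):

```python
def bayes_update(prior, likelihood_h, likelihood_not_h):
    """Posterior P(H|E) from prior P(H) and likelihoods P(E|H), P(E|~H)."""
    numerator = likelihood_h * prior
    return numerator / (numerator + likelihood_not_h * (1 - prior))

p = 0.5
# Ten pieces of evidence, each favoring H by a 10:1 likelihood ratio.
for _ in range(10):
    p = bayes_update(p, 0.9, 0.09)

# The posterior gets arbitrarily close to 1, but never reaches it:
# certainty would require the alternative to assign the evidence
# probability exactly zero.
assert 0.999 < p < 1.0
```

Each update multiplies the odds for H by 10, so after ten updates the odds are 10^10 : 1, which is overwhelming but still short of absolute certainty.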