Human reasoning is not Bayesian because Bayesianism requires perfectly accurate introspective access to one’s own beliefs.
Human reasoning is not Frequentist because Frequentism requires access to the long-run frequency of an event, and humans lack that access because we cannot remember the past with accuracy.
To be “frequentist” or “bayesian” is merely a philosophical posture about the correct way to update beliefs in response to sense-data. But what that correct way is remains an open problem: the best current solution, AFAIK, is Logical Induction.
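To put the two postures side by side, here is a minimal sketch in Python. The coin-flip setup and all numbers are invented for illustration; the point is only that both schools update on the same sense-data and differ in the bookkeeping.

```python
import random

random.seed(0)
# Invented setup: 100 flips of a coin whose unknown "true" bias is 0.7.
flips = [random.random() < 0.7 for _ in range(100)]

# Frequentist posture: the estimate is the observed long-run frequency.
freq_estimate = sum(flips) / len(flips)

# Bayesian posture: start from a Beta(a, b) prior over the bias and
# update it on each flip (Beta is conjugate to the Bernoulli likelihood).
a, b = 1.0, 1.0  # Beta(1, 1) = uniform prior
for heads in flips:
    if heads:
        a += 1  # one more pseudo-count of heads
    else:
        b += 1  # one more pseudo-count of tails
posterior_mean = a / (a + b)

print(f"frequentist estimate:    {freq_estimate:.3f}")
print(f"Bayesian posterior mean: {posterior_mean:.3f}")
```

Note that the Bayesian side had to pick a prior, Beta(1, 1), before seeing any data; that choice is exactly the “termination” the rest of this thread argues about.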
Bayesianism contradicts Rationalism because Rationalism requires stacktraces terminating in irrefutable observation, whereas Bayesian chains of inference always terminate in one or more mysterious answers (the priors).
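To spell out the stacktrace image (my gloss, assuming observations that are conditionally independent given the hypothesis H): each Bayesian update conditions on one more datum, and the chain can only bottom out in an unconditioned prior.

```latex
P(H \mid D_1, \dots, D_n)
  \propto P(D_n \mid H)\, P(H \mid D_1, \dots, D_{n-1})
  \propto \cdots
  \propto \Bigl[\prod_{i=1}^{n} P(D_i \mid H)\Bigr] P(H)
```

The final factor P(H) is the frame Bayes’ rule never explains: it tells you how to update a prior, not where it comes from.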
“Rationalism requires stacktraces terminating in irrefutable observation”
Like the previous two commenters, I find this statement odd. I don’t fully trust my senses. I could be dreaming/hallucinating. I don’t fully trust my knowledge of my thoughts. By this definition of a rationalist, I could never be one (and maybe I’m not) because I don’t think there is such a thing as an irrefutable observation. I think there was a joke in that statement, but, unobserved by me, it took flight and now soars somewhere else.
“Irrefutable” is another word for “mysterious”.
A prior isn’t the termination.
That sounds like you’re thinking of priors in terms of beliefs. As Gelman recently quoted:
(The first sentence is something Gelman is quoting from Deke et al., p. 4; the second sentence is Gelman’s agreement with it.)
Why is xyz your prior? The termination is the information you’ve drawn on to come to that prior.
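Concretely, a hypothetical sketch (all numbers invented): if an earlier study of the same process showed 12 successes in 40 trials, that information can be encoded directly as a Beta(12, 28) prior, so the justification for the prior is the earlier data, not a free-floating belief.

```python
# Hypothetical numbers throughout; the point is only the bookkeeping.
# The prior is built from earlier information, so the justification
# "stacktrace" terminates in that information, not in the prior itself.
last_year_successes, last_year_failures = 12, 28  # invented historical record

# Encode the historical record as a Beta prior (conjugate to Bernoulli).
a, b = last_year_successes, last_year_failures    # Beta(12, 28) prior

# This year's observations update the same pseudo-counts.
this_year_successes, this_year_failures = 9, 11   # invented new data
a += this_year_successes
b += this_year_failures

# Posterior mean = (12 + 9) / (40 + 20) = 0.350
print(f"posterior mean success rate: {a / (a + b):.3f}")
```

On this reading, “why is xyz your prior?” has a perfectly ordinary answer: point at the data (or expert knowledge, or theory) that the pseudo-counts summarize.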