Directly, no. But the process of science (like any use of Bayesian reasoning) is intended to gradually make our ontology a better fit to more of reality
Yes, it is intended to. Whether, and how, it works are other questions.
There’s also nothing about Bayesianism that guarantees incrementally better ontological fit, in addition to incrementally improving predictive power.
Bayes’ theorem is about the truth of propositions. Why couldn’t it be applied to propositions about ontology?
It’s about one of the things “truth” means. If you want to apply it to ontology, you need a kind of evidence that’s relevant to ontology—that can distinguish hypotheses that make similar predictions.
Correct me if I’m wrong, but I think we could apply the concept of logical uncertainty to metaphysics and then use Bayes’ theorem to update as our metaphysical research progresses, the way we can use it to update our credence in logically necessary truths or falsehoods.
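The sticking point above can be made concrete. Here is a minimal Bayes-rule sketch in Python (the hypotheses and numbers are invented for illustration): when two hypotheses assign the same likelihood to the evidence, the posterior equals the prior, so no amount of that evidence can distinguish them.

```python
def bayes_update(priors, likelihoods):
    """Return posteriors P(H|E) given priors P(H) and likelihoods P(E|H)."""
    unnormalized = [p * l for p, l in zip(priors, likelihoods)]
    total = sum(unnormalized)
    return [u / total for u in unnormalized]

priors = [0.5, 0.5]

# Two ontological hypotheses that make identical predictions:
# the evidence leaves the prior untouched.
print(bayes_update(priors, likelihoods=[0.9, 0.9]))  # [0.5, 0.5]

# Hypotheses that predict differently: evidence shifts the posterior.
print(bayes_update(priors, likelihoods=[0.9, 0.3]))  # [0.75, 0.25]
```

This is the formal shape of the objection: Bayes’ theorem updates on whatever likelihoods you feed it, so the question is whether metaphysical research can supply likelihoods that differ between ontological hypotheses.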
How do we use Bayes to find kinds of truth other than predictiveness?
If you are dubious that the methods of rationality work, I fear you are on the wrong website.
I’m not saying they don’t work at all. I have no problem with prediction.
I notice that you didn’t tell me how the methods of rationality work in this particular case. Did you notice that I conceded that they work in others?
If this website is about believing things that cannot be proven, and have never been explained, then it is “rationalist,” not rationalist.