It’s about one of the things “truth” means. If you want to apply it to ontology, you need a kind of evidence that’s relevant to ontology—that can distinguish hypotheses that make similar predictions.
Correct me if I’m wrong, but I think we could apply the concept of logical uncertainty to metaphysics and then use Bayes’ theorem to update as our metaphysical research progresses, the same way we can use it to update the probability of statements that are logically necessarily true or false.
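Concretely, the update I have in mind would look something like this (a rough sketch, with $H$ standing in for a metaphysical hypothesis and $E$ for a piece of evidence such as a new argument or result; the likelihood terms are the part I'm least sure how to cash out):

$$P(H \mid E) \;=\; \frac{P(E \mid H)\,P(H)}{P(E \mid H)\,P(H) + P(E \mid \neg H)\,P(\neg H)}$$

If $H$ and $\neg H$ make the same empirical predictions, the work would have to be done by how strongly each of them "predicts" non-empirical evidence like proofs, arguments, or intuitions.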
How do we use Bayes to find kinds of truth other than predictiveness?