Thanks for your reply here, Val! I’ll just add the following:
There’s a somewhat technical argument that predictions are not the kind of thing classically pointed at by a correspondence theory of truth. Correspondence theories instead tend to be about setting up a structured relationship between propositions and reality, with some firm ground from which to judge the quality of that relationship. So in that sense subjective probability doesn’t really meet the standard normally expected of a correspondence theory of truth, since it generally requires, explicitly or implicitly, the possibility of a view from nowhere.
That said, it’s a fair point that we’re still talking about how one part of the world relates to another, so it kinda looks like truth-as-predictive-power is a correspondence theory. However, since we’ve cut out metaphysical assumptions, there’s nothing for these predictions (something we experience) to relate to other than more experience. At best we have experience corresponding to itself, which breaks the whole idea of how a correspondence theory of truth is supposed to work, namely that there’s some ground or source (the territory) to compare against. A predictive theory of truth is predictions all the way down to unjustified hyperpriors.
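To make the “predictions all the way down” point a bit more concrete, here’s a minimal sketch in standard Bayesian notation (the symbols $x$, $\theta$, and $\alpha$ are just illustrative placeholders for data, parameters, and hyperparameters, not anything specific to my argument):

$$p(x_{\text{new}} \mid x) = \int p(x_{\text{new}} \mid \theta)\, p(\theta \mid x)\, d\theta, \qquad p(\theta \mid x) \propto p(x \mid \theta) \int p(\theta \mid \alpha)\, p(\alpha)\, d\alpha$$

Every prediction on the left rests on the prior over $\theta$, which in turn rests on the hyperprior $p(\alpha)$, and nothing inside the formalism justifies $p(\alpha)$ itself. That’s the sense in which there’s no independent ground to compare against, only more layers of expectation.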
I don’t get into this above, but this is why I think “truth” in itself is not that interesting; “usefulness to a purpose” is much more in line with how reasoning actually works, and truth is one kind of usefulness to a purpose. My case above is a modest claim that accurate prediction does a relatively good job of describing what people mean when they point at truth, a description grounded in the most parsimonious story I know how to tell about how we think.
How does subjective probability require the possibility of a view from nowhere?