If you ask me whether my reasoning is trustworthy, I guess I’ll look at how I’m thinking at a meta-level and see if there are logical justifications for that category of thinking, plus look at examples of my thinking in the past and see how often I was right. So, roughly, your “empirical” and “logical” foundations.
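To make the empirical half of that concrete, one rough way to check is to keep records of past predictions with stated confidences and compare the claimed confidence against the actual hit rate in each bucket. The sketch below, including all the numbers, is purely hypothetical, just to show the shape of the check:

```python
# A toy calibration check: for each stated confidence level, how often was I right?
# The prediction records below are invented purely for illustration.
from collections import defaultdict

past_predictions = [          # (stated confidence, whether the claim turned out true)
    (0.9, True), (0.9, True), (0.9, False),
    (0.7, True), (0.7, False), (0.7, True), (0.7, True),
    (0.6, True), (0.6, False),
]

buckets = defaultdict(list)
for confidence, was_right in past_predictions:
    buckets[confidence].append(was_right)

for confidence, outcomes in sorted(buckets.items()):
    hit_rate = sum(outcomes) / len(outcomes)
    print(f"claimed {confidence:.0%} -> right {hit_rate:.0%} of the time ({len(outcomes)} cases)")
```

If the 90% bucket only comes out right 60% of the time, that’s evidence the meta-level machinery is overconfident.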
And I sometimes use my reasoning to bootstrap myself to better reasoning. For example, I didn’t use to be Bayesian; I did not intuitively view my beliefs as having probabilities associated with them. Then I read Rationality, and was convinced by both theoretical arguments and practical examples that being Bayesian was a better way of thinking, and now that’s how I think. I had to evaluate the arguments in favor of Bayesianism in terms of my previous means of reasoning—which was overall more haphazard, but fortunately good enough to recognize the upgrade.
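For anyone who hasn’t read Rationality, here’s roughly what “viewing beliefs as having probabilities” cashes out to: credences are numbers, and a piece of evidence moves them by Bayes’ rule. The hypothesis, prior, and likelihoods below are made-up numbers, purely for illustration:

```python
# One Bayesian update: P(H | E) = P(E | H) * P(H) / P(E).
# All numbers are hypothetical, chosen only to show how a credence shifts.

prior = 0.30            # P(H): credence in the hypothesis before seeing the evidence
p_e_given_h = 0.80      # P(E | H): probability of the evidence if H is true
p_e_given_not_h = 0.20  # P(E | ~H): probability of the evidence if H is false

# Total probability of the evidence, marginalising over H
p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)

# Posterior credence after seeing the evidence
posterior = p_e_given_h * prior / p_e

print(f"prior = {prior:.2f}, posterior = {posterior:.2f}")  # prior = 0.30, posterior = 0.63
```

The point is just that a credence moves continuously with evidence rather than flipping a binary believe/don’t-believe switch.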
From the phrasing you used, it sounded to me like you were searching for some Ultimate Justification that could by definition only be found in regions of the space that have been ruled out by impossibility arguments. But it sounds like you’re well aware of those reasons, and must be looking elsewhere; sorry for misunderstanding.
But honestly I still don’t know what you mean by “trustworthy”. What is the concern, specifically? Is it:
That there are flaws in the way we think, like the ones catalogued in Wikipedia’s list of cognitive biases?
That there’s an influential bias that we haven’t recognized?
That there’s something fundamentally wrong with the way that we reason, such that most of our conclusions are wrong and we can’t even recognize it?
That our reasoning is fine, but we lack a good justification for it?
Something else?
If you are going to make very confident claims, you need a very strong basis. That’s one sense in which you need trustworthiness. But if you are not going to make very confident claims, you needn’t worry.
If you are going to promote a narrow epistemology based on, for instance, just science or just Bayes, then you need a justification for it that doesn’t also justify everything you want to exclude from your narrow epistemology. Circular justification would justify anything that’s self-consistent, so it’s not good enough.
If you’re not doing either of the above, then you can just embrace a liberal, pluralistic approach, and not worry.