Do not accept any of my words on faith,
Believing them just because I said them.
Be like an analyst buying gold, who cuts, burns,
And critically examines his product for authenticity.
Only accept what passes the test
By proving useful and beneficial in your life.
-- The Buddha, Jnanasara-samuccaya Sutra
Good instrumental rationality quote; not so good for epistemic rationality.
Why do you say that?
“Proving useful in your life” (but not necessarily “proving beneficial”) is the core of instrumental rationality, but what’s useful is not necessarily what’s true, so it’s important to refrain from using that metric in epistemic rationality.
Example: cognitive behavioral therapy is often useful “to solve problems concerning dysfunctional emotions”, but not useful for pursuing truth. For an example more relevant to Buddhism, there is also mindfulness-based cognitive therapy.
I suppose that is a tension between epistemic and instrumental rationality.
Put in terms of a microeconomic trade-off: The marginal value of having correct beliefs diminishes beyond a certain threshold. Eventually, the marginal value of increasing one’s epistemic accuracy dips below the marginal value that comes from retaining one’s mistaken belief. At that point, an instrumentally rational agent may stop increasing accuracy.
On the other hand, it may be a problem of local-versus-global optima: The marginal value of accuracy may creep up again. Or maybe those who see it as a problem can fix it with the right augmentation.
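To make the stopping rule above explicit (a rough sketch; the notation is mine, not from the comment): let V(a) be the value the agent gets from holding beliefs of accuracy a, and B(a) the value it still derives from the mistaken belief, which erodes as accuracy rises. An instrumentally rational agent keeps improving accuracy only while the marginal gain exceeds the marginal loss,

\frac{dV}{da} > -\frac{dB}{da},

and stops at the accuracy a* where the two sides are equal. The local-versus-global worry is then just the observation that dV/da need not fall monotonically: past a* it may rise again, so stopping at the first crossover can forgo a better optimum further on.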
There is no tension. Epistemic rationality is merely instrumental, while instrumental rationality is not. They are different kinds of things. Means to an end don’t compete with what the end is.
Upvoted for this
It is useful for pursuing truth to the extent that it can correct beliefs that are actually false, when those errors happen to tend in one direction.
This sometimes comes at the expense of other truths, just as pursuing evidence for your preferred conclusion turns up real evidence but a less accurate map.
Related quote from Epictetus.