Accuracy Versus Winning
Consider the problem of an agent who is offered a chance to improve their epistemic rationality for a price. What is such an agent’s optimal strategy?
A complete answer to this problem would involve a mathematical model to estimate the expected increase in utility associated with having more correct beliefs. I don’t have a complete answer, but I’m pretty sure about one thing: From an instrumental rationalist’s point of view, to always accept or always refuse such offers is downright irrational.
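To make the shape of that missing model concrete, here's a minimal sketch, entirely my own illustration rather than a worked-out theory. It considers a single belief guiding a single decision, and every name and number in it (the probabilities, the payoffs, the prices) is a hypothetical placeholder.

```python
# A toy expected-utility model of the "pay to improve your beliefs" offer.
# All quantities here are hypothetical illustrations, not real calibrations.

def expected_gain_from_accuracy(p_correct_now: float,
                                p_correct_after: float,
                                utility_if_correct: float,
                                utility_if_wrong: float) -> float:
    """Expected utility increase from raising the probability that a
    belief guiding some decision is correct."""
    before = p_correct_now * utility_if_correct + (1 - p_correct_now) * utility_if_wrong
    after = p_correct_after * utility_if_correct + (1 - p_correct_after) * utility_if_wrong
    return after - before

def should_accept_offer(gain: float, price_in_utility: float) -> bool:
    # The rule itself is a single comparison; the hard part, which this
    # sketch punts on, is estimating the inputs.
    return gain > price_in_utility

# Going from 60% to 90% correct on a belief worth 100 utils when right
# and 0 when wrong is worth 0.3 * 100 = 30 utils of expected gain...
gain = expected_gain_from_accuracy(0.6, 0.9, utility_if_correct=100, utility_if_wrong=0)
print(should_accept_offer(gain, price_in_utility=10))  # True: accept
# ...but the same improvement isn't worth a 50-util price.
print(should_accept_offer(gain, price_in_utility=50))  # False: refuse
```

The only point of the sketch is that the sign of gain minus price can go either way depending on the numbers, which is exactly why a blanket policy of always accepting or always refusing can't be optimal.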
And now for the kicker: You might be such an agent.
One technique humans can use to work toward epistemic rationality is to doubt themselves: most people think they are above average in a wide variety of areas, and since it's reasonable to assume that merit in at least some of these areas is normally distributed, many of those self-assessments must be wrong. But having a negative explanatory style, which is one way to doubt yourself, has been linked with sickness and depression.
And the inverse is also true: humans seem to be rewarded for a certain set of beliefs, namely those that help them maintain a somewhat flattering assessment of themselves. Having an optimistic explanatory style (in a nutshell, explaining good events in a way that makes you feel good, and explaining bad events in a way that doesn't make you feel bad) has been linked with success in sports, sales, and school.
If you’re unswayed by my empirical arguments, here’s a theoretical one. If you’re a human and you want correct beliefs, you must make a special effort to seek evidence that your beliefs are wrong, because one of our known defects is our tendency to stick with our beliefs for too long. But if you do this successfully, you will become less certain, and therefore less determined.
In some circumstances, it’s good to be less determined. But in others, it’s not. And to say that one should always look for disconfirming evidence, or that one should always avoid looking for disconfirming evidence, is ideological, according to the instrumental rationalist.
Who do you think is going to be more motivated to think about math: someone who feels it is their duty to become smarter, or a naive student who believes he or she has the answer to some mathematical problem and is only lacking a proof?
You rarely see a self-help book, entrepreneurship guide, or personal development blog telling people how to be less confident. But that’s what an advocate of rationalism does. The question is: do the benefits outweigh the costs?