Where do you find in that link the suggestion that rationalists should be less confident?
“Beware lest you become attached to beliefs you may not want.”
“Surrender to the truth as quickly as you can.”
One who sees that people generally overestimate themselves, and responds by downgrading their own self-confidence, imitates the outward form of the art without the substance.
Not necessarily. If there is no known way to correct for a bias, it makes sense to apply the sort of gross correction I described. For example, if I know that my co-worker and I both underestimate how long my projects take, but I’m not aware of any technique I can use to improve my estimates, I could start by asking my co-worker to do all the estimates and then multiply each one by two when telling my boss.
Is there a known way of correcting for human overconfidence? If not, I think the sort of gross correction I describe makes sense from an epistemically rational point of view.
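The gross correction described above amounts to multiplying a raw estimate by a fixed calibration factor before reporting it. A minimal sketch in Python; the factor of two and the sample estimates are assumptions for illustration, not anything from the thread:

```python
# Gross correction for a known underestimation bias: scale every raw
# estimate by a fixed calibration factor before reporting it.
CALIBRATION_FACTOR = 2.0  # assumed: "multiply that estimate by two"

def corrected_estimate(raw_days: float, factor: float = CALIBRATION_FACTOR) -> float:
    """Return a bias-corrected project estimate in days."""
    return raw_days * factor

# Hypothetical co-worker estimates, in days.
raw_estimates = [3.0, 5.0, 10.0]
reported = [corrected_estimate(d) for d in raw_estimates]
print(reported)  # [6.0, 10.0, 20.0]
```

The point of the sketch is that the correction needs no model of *why* the bias exists, only a rough sense of its size.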
Someone who “feels” it is their “duty” to do something is someone who already does not want to do it, so by definition the second has motivation and the first does not. But these are imaginary people and this is fictional evidence. A real answer to the rhetorical question might be found by surveying mathematics students, comparing those who stay the course and those who drop out. I do not know what such a survey would find.
Do you deny that believing you had the answer to a mathematical problem and only lacked a proof would be a powerful motivator to think about mathematics? I was once in this situation, and it certainly motivated me.
Someone who “feels” it is their “duty” to do something is someone who already does not want to do it, so by definition the second has motivation and the first does not.
The way I used “duty” has nothing to do with disliking a thing. To me, “duty” describes something that you feel you ought to do. It’s just an inconvenient fact about human psychology that telling yourself that something’s best makes it hard to do it. Being epistemically rational (figuring out what the best thing to do is, then compelling yourself to do it) often seems not to work for humans.
If there is no known way to correct for a bias, the thing to do is to find one. Swerving an arbitrary amount in the right direction will not do—reversed stupidity etc.
I once saw a poster in a chemist’s shop bluntly asserting, “We all eat too much salt.” What was I supposed to do about that? No matter how little salt I take in, or how far I reduce it, that poster would still be telling me the same thing. No, the thing to do, if I think it worth attending to, would be to find out my actual salt intake and what it should actually be. Then “surrender to the truth” and confidently do what the result of that enquiry tells me.
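The enquiry described above reduces to comparing a measured intake against a recommended limit and acting on the difference. A sketch, with both numbers made up for illustration:

```python
# Decide what to do about salt intake by measurement, not by slogan.
RECOMMENDED_MAX_G = 5.0   # assumed daily salt limit, grams
measured_intake_g = 6.5   # hypothetical result of tracking one's diet

excess = measured_intake_g - RECOMMENDED_MAX_G
if excess > 0:
    print(f"Reduce intake by {excess:.1f} g/day")
else:
    print("No change needed; the poster does not apply to you")
```

Unlike the poster, this procedure has a stopping condition: once the measured number is under the limit, it stops telling you to cut back.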
If someone finds it hard to do what they believe that they should and can, then their belief is mistaken, or at least incomplete. They have other reasons for not doing whatever it is, reasons that they are probably unaware of when they merely fret about what they ought to be doing. Compelling oneself is unnecessary when there is nothing to overcome. The root of indecision is conflict, not doubt; irrationality, not rationality.
Here’s a quote about rationality in action from a short story recently mentioned on LW, a classic of SF that everyone with an interest in rationality should read. I find that a more convincing picture than one of supine doubt.
Swerving an arbitrary amount in the right direction will not do—reversed stupidity etc.
Reversing stupidity is not the same thing as swerving an arbitrary amount in the right direction. And the amount is not arbitrary: like most of my belief changes, it is based on my intuition. This post by Robin Hanson springs to mind; see the last sentence before the edit.
Anyway, some positive thoughts I have about myself are obviously unwarranted. I’m currently in the habit of immediately doubting spontaneous positive thoughts (because of what I’ve read about overconfidence), but I’m beginning to suspect that my habit is self-destructive.
If someone finds it hard to do what they believe that they should and can, then their belief is mistaken, or at least incomplete.
Well yes, of course, it’s easier to do something if you believe you can. That’s what I’m talking about—confidence (i.e. believing you can do something) is valuable. If there’s no chance of the thing going wrong, then you’re often best off being overconfident to attain this benefit. That’s pretty much my point right there.
As for your Heinlein quote, I find it completely unrealistic. Either I am vastly overestimating myself as one of Heinlein’s elite, I am a terrible judge of people because I put so many of them into his elite, or Heinlein is wrong. I find it ironic, however, that someone who read the quote would probably be pushed towards the state of mind I am advocating: I’m pretty sure 95% of those who read it put themselves somewhere in the upper echelons, and once in the upper echelons, they are free to estimate their ability highly and succeed as a result.
I’m currently in the habit of immediately doubting spontaneous positive thoughts (because of what I’ve read about overconfidence), but I’m beginning to suspect that my habit is self-destructive.
Are you in the habit of immediately doubting negative thoughts as well? All emotionally laden spontaneous cognitive content should be suspect.
Also, when you correct an overly positive self-assessment, do you try to describe it as a growth opportunity? This violates no principles of rationality, and seems like it could mitigate the self-destruction. (See fixed vs. growth theories of intelligence.)
AFAICT, this means to seek disconfirming evidence, and update if and when you find it. Nothing to do with confidence.
Disconfirming evidence makes you less confident that your original beliefs were true.
If you find it; though this is nitpicking, as the net effect usually will be as you say. Still, this is completely different from the unconditional injunction to be less confident that the post suggests.
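The exchange above is Bayes’ theorem in words: evidence with a likelihood ratio below one lowers the posterior, and a failed search for disconfirming evidence is itself weak confirmation. A minimal sketch, where the prior and likelihoods are made-up numbers:

```python
def posterior(prior: float, p_e_given_h: float, p_e_given_not_h: float) -> float:
    """Update P(H) on evidence E via Bayes' rule."""
    numerator = p_e_given_h * prior
    return numerator / (numerator + p_e_given_not_h * (1.0 - prior))

prior = 0.8
# Disconfirming evidence: E is less likely if H is true than if it is false.
after_disconfirming = posterior(prior, 0.2, 0.6)
# If the search for disconfirming evidence comes up empty, that absence is
# itself evidence for H, and confidence goes up.
after_no_disconfirmation = posterior(prior, 0.8, 0.4)

print(after_disconfirming < prior < after_no_disconfirmation)  # True
```

So "nothing to do with confidence" is too strong: seeking disconfirming evidence moves your confidence in one direction or the other whether or not you find any.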