Overconfidence is usually costlier than underconfidence. The cost to become completely accurate is often greater than the benefit of being slightly-inaccurate-but-close-enough.
When these two principles are taken into account, underconfidence becomes an excellent strategy. It also leaves potential in reserve in case of emergencies. As being accurately-confident tends to let others know what you can do, it’s often desirable to create a false appearance.
The cost of underconfidence is an opportunity cost. Opportunity costs are easy to miss, so they tend to be underweighted (salience bias). This is not a rebuttal, but it is a reason to expect people to falsely conclude that overconfidence is costlier.
I approve of your response, Douglas_Knight, but think that it is both incomplete and somewhat inaccurate.
The cost of underconfidence isn’t necessarily an opportunity cost. It can be, yes, but it need not be. You are making a subtle and mostly implicit claim of universality about an assertion that is not universally the case.
A strategy doesn’t need to work in every possible contingency to be useful or valid.
Overconfidence and underconfidence both imply a non-optimal amount of confidence. It’s a little oxymoronic to claim that underconfidence is an excellent strategy: if it’s an excellent strategy, then it’s presumably not underconfidence. I assume what you are actually claiming is that, in general, most people would get better results by being less confident than they are? Or are you claiming that, relative to accurate judgements of the probability of success, it is better to consistently underestimate rather than overestimate?
You claim that overconfidence is usually costlier than underconfidence. There are situations where overconfidence has a potentially very high cost (overconfidently thinking you can safely overtake on a blind bend, perhaps), but in many situations the costs of failure are not as severe as people tend to imagine. In my experience, overconfidence (in the sense of estimating a greater probability of success than is accurate) can usefully compensate for overestimating the cost of failure.
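To make that compensation concrete, here is a minimal sketch with purely hypothetical numbers, assuming gains and failure costs can be placed on a single expected-value scale:

```python
# Hypothetical numbers illustrating how overconfidence about success can
# offset an exaggerated estimate of the cost of failure.

def expected_value(p_success, gain, failure_cost):
    # Expected value of attempting: win `gain` with probability p_success,
    # otherwise pay `failure_cost`.
    return p_success * gain - (1 - p_success) * failure_cost

# Accurate beliefs: 60% chance of success, gain of 10, true failure cost 5.
accurate = expected_value(0.60, 10, 5)   # 0.6 * 10 - 0.4 * 5 = 4.0

# Biased beliefs: failure cost exaggerated to 10, success probability
# inflated to 75%. The two errors roughly cancel out.
biased = expected_value(0.75, 10, 10)    # 0.75 * 10 - 0.25 * 10 = 5.0

# Both estimates are positive, so both agents attempt the worthwhile risk;
# the miscalibrated agent reaches the right decision for offsetting reasons.
print(accurate, biased)
```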
You seem to have a pattern of responding to posts with unsupported statements that appear designed more to antagonize than to add useful information to the conversation.
I am replying here instead of higher because I agree with mattnewport, but this is addressed to Annoyance. It is hard for me to understand what you mean by your post because the links are invisible and I did not instinctively fill them in correctly.
“Overconfidence is usually costlier than underconfidence.”
As best as I can tell, this is situational. I think mattnewport’s response is accurate. More on this below.
“The cost to become completely accurate is often greater than the benefit of being slightly-inaccurate-but-close-enough.”
It seems that the two paths from this statement are to stay inaccurate or start getting more efficient at optimizing your accuracy. It sounds too similar to saying, “It is too hard. I give up,” for me to automatically choose inaccuracy. I want to know why it is so hard to become more accurate.
This also seems situational in the sense that it holds often, not always. This is relevant below.
“When these two principles are taken into account, underconfidence becomes an excellent strategy.”
In addition to mattnewport’s comment about underconfidence implying non-optimal confidence, I think that building this statement on two situational principles is dangerous. Filling out the (situational) blanks leads to this statement:
If underconfidence is less costly than overconfidence, and the cost of becoming more accurate is greater than the benefit of being more accurate, then stay underconfident.
This seems to work just as well as saying this:
If overconfidence is less costly than underconfidence, and the cost of becoming more accurate is greater than the benefit of being more accurate, then stay overconfident.
Which can really be generalized to this:
If it costs more to change your confidence than the resulting benefit, do not change.
Which just leads us back to mattnewport’s comment about optimal confidence. It also seems like it was not the point you were trying to make, so I assume I made a mistake somewhere. As best as I can tell, it was underemphasizing the two situational claims. As a result, I fully understand the request for more support in that area.
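For what it’s worth, that generalized rule can be written out explicitly. This is a minimal sketch with hypothetical quantities, assuming the cost and benefit of recalibrating can be measured on the same scale:

```python
# The generalized rule: adjust your confidence only when the benefit of
# greater accuracy exceeds the cost of achieving it. Note the symmetry:
# nothing below depends on whether you start overconfident or underconfident.

def should_recalibrate(benefit_of_accuracy, cost_of_recalibrating):
    return benefit_of_accuracy > cost_of_recalibrating

print(should_recalibrate(benefit_of_accuracy=3, cost_of_recalibrating=5))  # False: stay put
print(should_recalibrate(benefit_of_accuracy=8, cost_of_recalibrating=5))  # True: recalibrate
```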
“It also leaves potential in reserve in case of emergencies. As being accurately-confident tends to let others know what you can do, it’s often desirable to create a false appearance.”
Acting overconfident is another form of bluffing. Also, acting one way or the other is a little different than understanding your own limits. How does it help if you bluff yourself?
“Overconfidence and underconfidence both imply a non-optimal amount of confidence.”
Not in the sense of logical implication. The terms refer to levels of confidence greater or lesser than they should be, with the criteria utilized determining what ‘should’ means in context. The utility of the level of confidence isn’t necessarily linked to its accuracy.
Although accuracy is often highly useful, there are times when it’s better to be inaccurate, or to be inaccurate in a particular way or in a particular direction.
“You seem to have a pattern of responding to posts with unsupported statements”
I can support my statements, and support my supports, and support my support supports, but I can’t provide an infinite chain of supports. No one can. The most basic components of any discussion stand by themselves, and are validated or not by comparison with reality. Deal with it.
“that appear designed more to antagonize than to add useful information to the conversation”
They’re crafted to encourage people to think and to facilitate that process to the degree to which that is possible. I can certainly see how people uninterested in thinking would find that unhelpful, even antagonizing. So?
“A strategy doesn’t need to work in every possible contingency to be useful or valid.”
I suspect you are overconfident in that belief. Simply stating something is not a persuasive argument.
“Simply stating something is not a persuasive argument.”
Is simply stating that supposed to be persuasive?
Sooner or later we have to accept or reject arguments on their merits, and that requires evaluating their supports, not demanding further supports for them.
Why is confidence or lack thereof an issue aside from personal introspection?
If you are underconfident you may pass up risky but worthwhile opportunities, or spend resources on unnecessary safety measures. As for overconfidence, see hubris. Also, welcome to Less Wrong.