But more than that, I am trying to switch it off when talking to other people, for the simple reason (and I’m sure this has already been pointed out before) that if you compare three people, one who estimates the probability of an event at 110%, one who estimates it at 90%, and one who compensates for overconfidence bias and estimates it at 65%, the first two will win friends and influence people, while the third will seem indecisive.
Welcome!
Made me think of this article. Yes, you may be able, in the short run, to win friends and influence people by tricking yourself into being overconfident. But that belief is only in your head and doesn’t affect the universe, and thus doesn’t affect the probability of Event X happening. Which means that if, realistically, X is 65% likely to happen, then you, with your overconfidence, claiming that X is bound to happen, will eventually look like a fool 35% of the time, and will make it hard for yourself to leave a line of retreat.
Conclusion: in the long run, it’s very good to be honest with yourself about your predictions of the future, and probably preferable to be honest with others, too, if you want to recruit their support.
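A minimal simulation of the long-run arithmetic above, using the comment’s 65% figure (the trial count and seed are arbitrary assumptions, chosen only for illustration):

```python
import random

random.seed(0)
p_true = 0.65      # the comment's assumed real probability of Event X
n_claims = 10_000  # arbitrary number of confident predictions

# An overconfident speaker claims certainty every time; each claim
# fails whenever the event does not occur, i.e. ~35% of the time.
failures = sum(random.random() >= p_true for _ in range(n_claims))
print(f"Confident claims that failed: {failures / n_claims:.1%}")  # ~35%
```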
Excellent points, and of course it is situation-dependent. If one makes erroneous predictions in archived forms of communication, e.g. these posts, then yes, those predictions can come back to haunt you; but often, especially in non-archived communications, people will remember the correct predictions and forget the false ones. It should go without saying that I do not intend to be overconfident on LW—if I were going to be, then the last thing I would do is announce this intention!
In a strange way, I seem to want to hold three different beliefs:
1) An accurate assessment of what will happen, for planning my own actions.
2) A confident, stopping just short of arrogant, belief in my predictions, for impressing non-rationalists.
3) An unshakeable belief in my own invincibility, so that psychosomatic effects keep me healthy.
Unfortunately, this kinda sounds like “I want to have multiple personality disorder”.
If you’re going to go that route, at least research it first. For example:
http://healthymultiplicity.com/
Thanks for the advice, but I don’t actually want to have multiple personality disorder—I was just drawing an analogy.
Hm.
So, call -C1 the social cost of reporting a .9 confidence of something that turns out false, and -C2 the social cost of reporting a .65 confidence of something that turns out false. Call C3 the benefit of reporting .9 confidence of something true, and C4 the benefit of .65 confidence.
How confident are you that (.65C3 - .35C1) < (.65C4 - .35C2)?
In certain situations, such as sporting events that do not involve betting, my confidence that (.65C3 - .35C1) < (.65C4 - .35C2) is at most 10%. In these situations, confidence is valued far more than epistemic rationality.
I would say I’m about 75% confident that (.65C3 - .35C1) < (.65C4 - .35C2)… But one of the reasons I don’t even want to play that game is that I feel I am completely unqualified to estimate probabilities about that, and most other things. I would have no idea how to go about estimating the probability of, for example, the Singularity occurring before 2050...much less how to compensate for biases in my estimate.
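For concreteness, a minimal sketch of the expected-payoff comparison being debated; the values plugged into C1 through C4 are purely hypothetical placeholders, since nothing in the thread estimates them:

```python
# Hypothetical placeholder values; the thread never estimates these.
C1 = 10.0  # social cost of a failed .9-confidence claim
C2 = 3.0   # social cost of a failed .65-confidence claim
C3 = 5.0   # social benefit of a correct .9-confidence claim
C4 = 2.0   # social benefit of a correct .65-confidence claim

p = 0.65  # assumed true probability that the event occurs

ev_report_90 = p * C3 - (1 - p) * C1  # expected payoff of reporting .9
ev_report_65 = p * C4 - (1 - p) * C2  # expected payoff of reporting .65

print(f"Report .9 confidence:  {ev_report_90:+.2f}")
print(f"Report .65 confidence: {ev_report_65:+.2f}")
print("Reporting .65 wins" if ev_report_65 > ev_report_90 else "Reporting .9 wins")
```

With these placeholders the honest report comes out ahead, but a smaller C1 (an audience that quickly forgets failed confident claims, as suggested earlier in the thread) flips the inequality, which is why the question is an empirical one.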
I think I also have somewhat of an ick reaction towards the concept of “tricking” people to get what you want, even if in a very subtle form. I just...like...being honest, and it’s hard for me to tell if my arguments about honesty being better are rationalizations because I don’t want being dishonest to be justifiable.
The way to bridge that gap is to volunteer predictions only when you’re quite confident, and otherwise stay quiet, change the subject, or murmur a polite assent. You’re absolutely right that explicitly declaring a 65% confidence estimate will make you look indecisive, but people aren’t likely to notice that you make predictions less often than others; they’ll be too focused on the fact that when you do make predictions, you have an uncanny tendency to be correct...and that you’re pleasantly modest and demure, too.
(nods) That makes sense.