First, disambiguate the word “confidence”. There are numbers we call confidence—cold calculations that feed into other cold calculations. And then there’s an emotion we call confidence. These are not the same thing, and should not be subject to the same policies. To a first approximation, the numbers should be truthful and the emotion should be wildly over-optimistic.
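A minimal sketch of this policy split, assuming a toy expected-value decision (all numbers and names here are hypothetical, not from the comment): the truthful probability is the only thing that feeds the cold calculation, while the over-optimistic emotion lives in a separate channel that never touches the math.

```python
# Toy sketch (hypothetical; not from the original comment): keep the
# numerical confidence honest and feed only it into decisions, while
# the emotional confidence stays in a separate channel that never
# contaminates the calculation.

def expected_value(p_success: float, payoff: float, cost: float) -> float:
    """Cold calculation: consumes the truthful probability, nothing else."""
    return p_success * payoff - cost

p_success = 0.3                             # the number: a calibrated estimate
pep_talk = "I think I can, I think I can!"  # the emotion: kept out of the math

ev = expected_value(p_success, payoff=100.0, cost=20.0)
print(pep_talk)                                       # motivate yourself freely...
print(f"Attempt the task iff EV > 0: EV = {ev:.1f}")  # ...but decide on the number
```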
So you disagree with Tordmor, and are arguing that we should never act like the Little Engine That Could by saying ‘I think I can’ to raise our credence that we will succeed, even when doing so helps us win more often? This is not totally implausible; I’ve heard of cases where declining to make Bayesian updates is the rational choice in game-theoretic situations (where ignorance shields you from defection). EDIT: I just realized that my comment here makes absolutely no sense—just ignore it.
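For what the first half of that (since-retracted) comment is gesturing at, here is a minimal sketch of an honest Bayesian update on a success credence, with entirely hypothetical numbers, contrasted in a comment with the ‘I think I can’ move of inflating the number itself:

```python
# Hypothetical numbers, not from the thread: an honest Bayesian update
# on "will I succeed?", versus the Little-Engine move of overwriting
# the credence itself.

def bayes_update(prior: float, p_evidence_if_success: float,
                 p_evidence_if_failure: float) -> float:
    """Posterior P(success | evidence) by Bayes' rule."""
    numerator = prior * p_evidence_if_success
    denominator = numerator + (1.0 - prior) * p_evidence_if_failure
    return numerator / denominator

prior = 0.5  # initial credence in success
# Evidence: a discouraging track record on similar tasks.
posterior = bayes_update(prior, p_evidence_if_success=0.2,
                         p_evidence_if_failure=0.6)
print(f"Honest posterior: {posterior:.2f}")  # 0.25 -- the truthful number

# The 'I think I can' move would be something like:
#     credence = max(posterior, 0.9)
# i.e. inflating the number itself, rather than keeping the optimism
# in the emotional channel as the parent comment recommends.
```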