So you disagree with Tordmor, and are arguing that we should never act like the Little Engine That Could by saying ‘I think I can’ to increase our credence in our likelihood of success, even if doing so helps us win more often? This is not totally implausible; I’ve heard of cases where not making Bayesian updates is the rational choice in game theoretic situations (where ignorance shields you from defection). EDIT: I just realized that my comment here makes absolutely no sense—just ignore it.