I think in many cases this is a feature rather than a bug. I'd be curious whether you have an example of a time when having made the prediction would be deleterious to realising the desired outcome? Assuming, as Adam says, that you're more interested in the outcome than in the prediction being accurate.
Assigning a low probability that I will do a task in time is a self-fulfilling prophecy. Because the expected utility (probability times utility) is low, the motivation to do the task decreases. Ideally I would never assign probabilities to acts when choosing what to do, and only compare their utilities.
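As a rough illustration with made-up numbers: suppose an important task has utility \(U = 10\) but I only give myself a \(P = 0.2\) chance of finishing it in time, while a trivial task has utility \(U = 3\) with \(P = 0.9\). Then

\[
\mathbb{E}[U_{\text{important}}] = 0.2 \times 10 = 2 \;<\; \mathbb{E}[U_{\text{trivial}}] = 0.9 \times 3 = 2.7,
\]

so the pessimistic prediction alone is enough to make the lower-stakes task win out.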