Thanks. Some of these seem to be good ideas indeed.
Isn’t it an issue, though, that in a lot of these cases you/your colleagues have a large direct impact on the outcome, and knowing the prediction itself can change your/their behaviour? E.g., if there is a public 90% prediction that Adam will complete task X and a 10% prediction that Bob will complete Y, their knowledge of these can affect how they approach their work. Maybe Bob will simply give up on Y.
That being said, it can still be an effective productivity tool, but it no longer directly measures the outcome it ostensibly states. Instead, it becomes a self-referential statement (“The probability that Bob completes Y, given that he knows about this prediction, is 10%”).
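Put in probability notation (my gloss, not the commenter’s own formulation): the public forecast ends up tracking P(Bob completes Y | Bob knows the prediction) = 0.10, which need not equal the unconditional P(Bob completes Y).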
I think in many cases this is a feature rather than a bug. I’d be curious whether you have an example of a time when having made the prediction would be deleterious to realising the desired outcome? Assuming, as Adam says, that you’re more interested in the outcome than in the prediction being accurate.
Assigning a low probability to my finishing a task in time is a self-fulfilling prophecy: because the expected utility (probability times utility) is low, my motivation to do the task decreases. Ideally I would never assign probabilities to my own acts when choosing what to do, and would only compare their utilities.
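Here’s a toy sketch of that dynamic in code (the probabilities and the utility scale are hypothetical, chosen purely to illustrate the motivation effect, not taken from the discussion above):

```python
# Toy illustration of how a low assigned probability can act as a
# self-fulfilling prophecy via expected utility. All numbers are hypothetical.

def expected_utility(p_success: float, utility_of_success: float) -> float:
    """Expected utility of attempting a task: probability times payoff."""
    return p_success * utility_of_success

UTILITY_OF_FINISHING = 100.0  # stylised value of completing the task

# Comparing utilities directly, the task is clearly worth attempting:
print(UTILITY_OF_FINISHING > 0)                      # True

# But once a 10% prediction anchors the perceived success probability,
# the expected utility of trying collapses, and motivation with it:
print(expected_utility(0.90, UTILITY_OF_FINISHING))  # 90.0 -- feels worth doing
print(expected_utility(0.10, UTILITY_OF_FINISHING))  # 10.0 -- barely worth starting
```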
I think it’s still very useful to be able to predict your own behaviour (including in the case where you know you’ve made a prediction about it).
Things can get weird if you care more about the prediction resolving correctly than about the event itself, but this should rarely be the case, and it is worth avoiding, I think.