I think a basic possible concern (though likely a minor one in the long term, given sufficient rationality training and experience) is that making a lot of explicit predictions about things you have causal control over can have a self-fulfilling prophecy effect. For example, if you expect you won't be able to accomplish something, that belief could propagate through your mind and subconsciously make you try less hard (since you'd expect to fail anyway), which later lowers your subjective estimate of your chances of success, so you try even less hard, and so on. A self-reinforcing loop of anti-productivity.