A different sort of reason for not updating all the way:
When there isn’t a time-dependence of the sort exhibited by the OP’s example, or a steady flow of genuine new information that just happens all to be pointing the same way, a steady trend probably is an indication that you’re deviating from perfect rationality. But the direction you’re deviating in could be “repeatedly updating on actually-not-independent information” rather than “failing to notice a trend”, and in that case the correct thing to do is to roll back some of your updating rather than guessing where it’s leading to and jumping there.
For instance, suppose there’s some amount of evidence for X, and different people discover it at different times, and you keep hearing people one after another announce that they’ve started believing X. You don’t hear what the original evidence is, and maybe it’s in some highly technical field you aren’t expert in so you couldn’t evaluate it anyway; all you know is that a bunch of people are starting to believe it. In this scenario, it’s correct for each new conversion to push you a little way towards believing X yourself, but the chances are that your updates will not be perfectly calibrated to converge on what your Pr(X) would be if you understood the actual structure of what’s going on. They might well overshoot, and in that case extrapolating your belief-trajectory and jumping to the apparent endpoint would move you the wrong way.
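To make the overshoot concrete, here’s a minimal sketch (the numbers are made up for illustration): one shared piece of evidence justifies roughly one update, but if you treat each person’s announcement as independent evidence of that same strength, your Pr(X) runs far past where it should stop, and extrapolating the trend would only push it further.

```python
import math

def logit(p):
    """Convert a probability to log-odds."""
    return math.log(p / (1 - p))

def sigmoid(x):
    """Convert log-odds back to a probability."""
    return 1 / (1 + math.exp(-x))

prior = 0.2            # your prior Pr(X)  (illustrative value)
evidence_log_lr = 1.0  # log likelihood ratio of the single shared piece of evidence (illustrative)
n_converts = 10        # people who announce belief in X, all reacting to that same evidence

# Correct update: the announcements are all downstream of one piece of evidence,
# so together they justify (roughly) a single update.
correct = sigmoid(logit(prior) + evidence_log_lr)

# Naive update: treat each announcement as independent evidence of the same strength.
naive = sigmoid(logit(prior) + n_converts * evidence_log_lr)

print(f"prior:             {prior:.3f}")
print(f"one real update:   {correct:.3f}")
print(f"ten naive updates: {naive:.3f}  # overshoots; jumping to the trend's endpoint would overshoot further")
```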
Yep, that’s another excellent pitfall to beware of.