There’s significant ambiguity about what counts as “changing” a belief. If you look at belief in the only way that’s rational, namely as coming in degrees, then you “change” your belief whenever you alter your subjective probability. Your examples suggest that you’re defining belief change as binary. I think people’s subjective probabilities change all the time, but you rarely see a complete flip-flop, for good reason: significant beliefs often rest on a vast body of evidence, which a single new piece of evidence, however striking, is unlikely to reverse (in binary terms).
From this Bayesian standpoint, the over-correction advice (e.g., reveling in propaganda) is misguided, because overshooting isn’t harmless; you can make yourself epistemically worse off (although this won’t be evident in a binary model). For example, if you start with a probability estimate of .4 and obtain evidence that should move you to .6, but instead you over-correct and end up at .9, you end up more wrong than before, although this territory is concealed when you use a binary map.
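To make the “more wrong than before” claim concrete, here is a minimal Python sketch that scores each report with the Brier score (expected squared error), assuming, as in the hypothetical numbers above, that the evidence actually warrants a credence of .6. Overshooting to .9 does worse in expectation than staying put at .4:

```python
# Expected Brier score (squared error) of reporting a probability `report`
# when the event actually occurs with probability `true_prob`.
# All numbers are the hypothetical ones from the paragraph above.

def expected_brier(report: float, true_prob: float) -> float:
    """Expected squared error of `report` against a Bernoulli(true_prob) outcome."""
    return true_prob * (1 - report) ** 2 + (1 - true_prob) * report ** 2

true_prob = 0.6  # the credence the evidence is assumed to warrant
for report in (0.4, 0.6, 0.9):
    print(f"report {report:.1f}: expected Brier score {expected_brier(report, true_prob):.3f}")

# report 0.4: expected Brier score 0.280
# report 0.6: expected Brier score 0.240
# report 0.9: expected Brier score 0.330
```

On this standard accuracy measure, the over-corrected .9 is strictly worse than the un-updated .4, and both lose to the warranted .6, which a binary right/wrong bookkeeping simply can’t register.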
You’re discussing, I think, mainly far-mode beliefs. Near-mode beliefs change all the time. (Any time you work on an equation, try one solution, and abandon it for another, you’ve changed your belief.) Far-mode beliefs resist change because far mode evolved as a means of deceptive signaling. (H/T Robin Hanson) As such, they are less responsive to evidence and more responsive to status concerns, e.g., appearing consistent.
The only way to change a far-mode tendency is on its own terms, by changing what you are committed to signaling. Rather than signaling that you are a consistent person who is always right the first time, you turn yourself into someone inclined to signal that he is open-minded, responsive to the evidence, and capable of learning from mistakes.
If you update on the evidence from .3 to .5, and later evidence shows that you still act as if you believe the probability is .3, then you should consider irrationally changing your belief. Of course, you risk over-updating to .7 or .9, but that is a question for expected utility, not a point against the concept as a whole.
It may be possible to end up more accurate by over-correcting than by simply pushing in the right direction; inundate yourself with “.5” propaganda or something.
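A toy model may make that trade-off clearer. Suppose, purely as an assumption for illustration, that the belief you actually act on gets dragged some fixed fraction of the way back toward your old anchor; then stating an overshot number can be what lands your behavior on the warranted credence. A minimal Python sketch, with the pull-back fraction and all probabilities made up for the example:

```python
# Toy model of deliberate over-correction: the acted-on belief is assumed
# to be pulled a fixed fraction of the way back toward the old anchor.
# The 0.5 pull strength and all probabilities are illustrative assumptions.

def acted_on_belief(stated: float, anchor: float, pull: float = 0.5) -> float:
    """Belief you act on: the stated update, dragged partway back to the anchor."""
    return stated + pull * (anchor - stated)

anchor, target = 0.3, 0.5  # old belief and the credence the evidence warrants

honest = acted_on_belief(target, anchor)   # state .5, end up acting on .4
overshoot = acted_on_belief(0.7, anchor)   # state .7, end up acting on .5

print(f"state the target (.5) -> act on {honest:.2f}")
print(f"overshoot to .7       -> act on {overshoot:.2f}")
```

Under this assumed pull-back, stating .7 is what actually puts your behavior at the warranted .5; the expected-utility question is whether that gain outweighs the risk that the pull is weaker than you think and you genuinely land at .7 or .9.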