Your memory eventually drives confidence in each hypothesis to 1 or 0
Our memory tends to hold less and less information over time. We forget some things outright, and our memories of others become simplified: a complex article boils down to “X is bad, Y is good, try to do better”.
One unexpected consequence of this is how it affects our sense of probability: describing a probability as a binary true/false takes only 1 bit of information, but describing it as a percentage takes many more bits!
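A quick back-of-the-envelope sketch of the bit counts (the 1% resolution is my own illustrative assumption, not something the post specifies):

```python
import math

# A binary verdict ("true"/"false") fits in a single bit.
binary_bits = 1

# A percentage at 1% resolution must distinguish 101 levels (0%..100%),
# which takes ceil(log2(101)) bits.
percent_levels = 101
percent_bits = math.ceil(math.log2(percent_levels))

print(binary_bits)   # 1
print(percent_bits)  # 7
```

So a rounded percentage costs roughly seven times as much memory as a bare verdict, which is one way to see why the verdict is what tends to survive.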
Because of this, a person’s confidence in what they consider the most likely hypothesis tends toward 1 over time, while the probability of every other hypothesis they have considered tends toward 0. Each time they recall a hypothesis, they are less and less likely to remember the nuance “the probability of hypothesis A = X”; they simply remember “this is the truth” (meaning the probability of A = 100%).
(Edit: it works this way only for some people. Others understand that just because they remember something as “true,” it doesn’t mean their past self believed it with 100% certainty. I wrote this article to help everyone behave like those “others”.)
Another way certainty in a hypothesis can grow over time is when we forget its weak spots: the things we should probe in order to test it.
I’ve often found myself in a cycle like this: I study a hypothesis and feel, “I’m not entirely sure, but it seems true.” Later, I forget this feeling, and eventually I start thinking of the hypothesis as simply true, almost like gravity.
Therefore, right after thinking through a hypothesis, you should write down your confidence and the parts you are not sure about. You should also keep this bias in mind so you can notice it when you rely on a hypothesis you last examined a long time ago.
TL;DR: memory is bad, writing is good, try to do better.
Feels empirically true. I remember cases where I thought about a memory X and was initially uncertain about some aspect of it, but when I think about it later it feels either true or false in my memory, so I have to say, “well no, I know that I was 50/50 on this when the memory was more recent, so that’s what my probability should be now, even though it doesn’t feel like it anymore”.
Seems like the fact that I put about a 50% probability on the thing survived (easy/clear enough to remember), but the reasons did not, so the probability no longer feels accurate.
One bit could also encode “probably true” and “probably false”. It doesn’t have to be “certainly true” and “certainly false”. And this is of course what we observe. We aren’t perfectly certain in everything we can barely remember to be true.
With Metaculus I do write down my confidence on pretty much every forecast I make, and it has helped. But even then, when coming back to long-term forecasts, I sometimes find myself stunned by my specific %, likely having forgotten some of the information that brought me to that forecast in the first place.
But if I only remember the most significant bit, I am going to treat it more like 25%/75% as opposed to 0/1.
So if one day you decided that P of X ≈ 1, you would remember “it’s true but I’m not sure” after one year?
If I only have 1 bit of memory space, and the probabilities I am remembering are uniformly distributed from 0 to 1, then the best I can do is remember whether the chance is > 1/2.
And then a year later, all I know is that the chance is > 1/2, but otherwise uniform. So the average value is 3/4.
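The 3/4 figure is easy to check with a small simulation (a sketch under the commenter's assumptions: uniform probabilities, and memory keeping only the above-1/2 bit):

```python
import random

random.seed(0)

# Original probabilities are uniform on [0, 1]; all that memory keeps
# is whether each one was above 1/2.
probs = [random.random() for _ in range(100_000)]
remembered_true = [p for p in probs if p > 0.5]

# The best later estimate for the "remembered as true" group is the
# mean of a uniform distribution on (1/2, 1), i.e. 3/4.
estimate = sum(remembered_true) / len(remembered_true)
print(round(estimate, 2))  # close to 0.75
```

This matches the earlier comment that a single surviving bit should be read as 25%/75% rather than as 0/1.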
The limited memory does imply lower performance than unlimited memory.
And yes, when I was in a pub quiz, I found myself saying “I think it’s this option, but I’m not sure” quite a lot.