My immediate thoughts (apologies if they are incoherent): the predictability of belief updating could be due in part to what qualifies as “updating”. In the examples given, belief updating happened when new information was presented. However, I’m not sure that models how we actually think.
What if “belief updating” is compounded at some interval, and in the absence of new information, “updated” old beliefs don’t actually tend to change? Every moment you believe something, even in the absence of new information, would qualify as a moment of updating one’s beliefs.
Suppose you believe a coin has a 50% chance of being weighted. Perhaps it matters whether you wait an hour before flipping the coin or flip it immediately. After all, it’s not unreasonable that if you believe something for a long time and are never proven wrong, you assign it higher certainty than something you just started believing one minute ago. I understand that in the coin example this is an unreasonable conclusion, but I believe it holds in cases where we would expect counterexamples to have made themselves known to us over time (if they existed).
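To make that distinction concrete, here is a minimal Bayesian sketch of the “counterexamples would have surfaced” case, assuming (purely hypothetically) that counterexamples arrive as a Poisson process with rate `lam` whenever the belief is false. Observing silence for `t` periods is then genuine evidence, so the posterior rises with waiting time:

```python
import math

def posterior_after_silence(prior, lam, t):
    """Posterior that a belief is true after t quiet time units,
    assuming counterexamples would arrive at Poisson rate `lam`
    if the belief were false (`lam` is a hypothetical parameter).
    P(silence | true) = 1; P(silence | false) = exp(-lam * t).
    """
    p_silence_if_false = math.exp(-lam * t)
    return prior / (prior + (1 - prior) * p_silence_if_false)

print(posterior_after_silence(0.5, lam=0.5, t=0))  # 0.5: no waiting, no change
print(posterior_after_silence(0.5, lam=0.5, t=4))  # ~0.88: four quiet periods
```

In the coin case the counterexample rate is effectively zero (an unflipped coin generates no evidence either way), which is why waiting an hour rationally changes nothing; the proposal is that the updating mechanism fires anyway.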
This could explain polarization through a very natural lens: it’s just a consequence of the fact that being presented with new information is rare, yet we still “update” our beliefs even when not presented with new information, and these updated beliefs carry more weight than if we had adopted them for the first time.
So, it could be that when we update a belief, we add in some weight factor that depends on how many times we have updated the same belief in the past. So it’s not that current probability equals your current expectation of your future probability.

But rather,

current probability + past held probabilities (those equivalent to the current one, or at least not excluded by it) = current expectation of future probability
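A toy sketch of that identity, with all weights being my own illustrative assumptions: treat the current probability and each compatible past held probability as equal votes, so that a new piece of evidence is just one more vote and a long-held belief barely moves:

```python
def expected_future_p(current_p, past_ps, new_evidence_p=None, tol=0.05):
    """Proposed rule: the current probability plus compatible past held
    probabilities (those within `tol` of current) jointly set the
    expectation of the future probability. `tol` and the equal vote
    weights are illustrative assumptions, not derived values.
    """
    compatible = [p for p in past_ps if abs(p - current_p) <= tol]
    votes = [current_p] + compatible
    if new_evidence_p is not None:
        votes.append(new_evidence_p)  # new information is just one more vote
    return sum(votes) / len(votes)

# A fresh belief vs. a long-held belief meeting the same evidence:
print(expected_future_p(0.5, past_ps=[], new_evidence_p=0.9))          # 0.7
print(expected_future_p(0.5, past_ps=[0.5] * 20, new_evidence_p=0.9))  # ~0.52
```

On this sketch, the same evidence that moves a freshly adopted belief from 0.5 to 0.7 barely budges a twenty-times-reaffirmed one, which is the inertia described above.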
The number of past held probabilities to be considered could depend on the frequency at which one updates one’s positions. Since this frequency varies, it would explain why polarization varies, and would even predict that people who hold a belief and think about it frequently (which we might use as an indicator of more frequent belief updating), but don’t encounter any new information and don’t question their position, have a higher risk of polarization. That prediction seems self-evident, which I hope is an indicator that my proposed mathematics is reasonable.
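Finally, a toy simulation of that prediction, where each contentless “update tick” nudges the belief toward the nearer extreme (the step size is an arbitrary assumption): two agents start with the same belief, and the one who ruminates more often ends up far more polarized.

```python
def ruminate(p, ticks, step=0.02):
    """Apply `ticks` contentless update ticks to probability p.
    Each tick nudges p toward the nearer extreme by a fraction
    `step` of the remaining distance (illustrative assumption).
    """
    for _ in range(ticks):
        if p > 0.5:
            p += step * (1 - p)  # drift toward certainty it's true
        elif p < 0.5:
            p -= step * p        # drift toward certainty it's false
    return p

# Same starting belief, different rumination frequency over one year:
print(ruminate(0.6, ticks=12))   # thinks about it monthly: ~0.69
print(ruminate(0.6, ticks=365))  # thinks about it daily:   ~0.9997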