This clarified something for me, so I appreciate the discussion!
Changing the model is like a phase transition cost—any change in the state will require some cost, whether it is simplifying or complexifying the model. So you’re right that I have no clear reason to think that “adding information to a mental model should be more expensive than deleting information”—though it’s an interesting question.
But the cost of utilizing a model, as opposed to modifying it, is always lower for lower-entropy models. A meta-model with uncertainty about which of 3 possible models to use is necessarily more complex to compute the result of than any of the 3 underlying models. Minds cannot have arbitrarily complex representations that they constantly use, because it would take forever to decide anything.
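To make the cost comparison concrete, here's a minimal sketch (the three toy models and the credence weights are made up for illustration): using the uncertain meta-model means evaluating every underlying model and then combining their outputs, so its per-use cost is the sum of all three evaluations plus the mixing step, whereas committing to any single model costs one evaluation.

```python
# Three toy "models": each maps a situation to a predicted outcome.
def model_a(x):
    return 2 * x

def model_b(x):
    return x + 10

def model_c(x):
    return x ** 2

# A meta-model that is uncertain which model applies must evaluate
# all of them and weight the results by its credence in each model.
def meta_model(x, weights=(0.5, 0.3, 0.2)):
    predictions = (model_a(x), model_b(x), model_c(x))
    return sum(w * p for w, p in zip(weights, predictions))

# Using a single model: one evaluation.
print(model_a(4))     # 8
# Using the meta-model: three evaluations plus the weighted sum.
print(meta_model(4))  # 0.5*8 + 0.3*14 + 0.2*16 = 11.4
```

This is why, on a per-decision basis, a resolved (lower-entropy) model is cheaper to run even if arriving at it was costly.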
Your last point, about mental stability, is not clear to me yet. If you want to expand on it, that would be great.
Take a limiting belief that a lot of people hold:
“I shouldn’t have to ask my partner for what I want.”
What happens if you suddenly drop that belief? You find yourself in the challenging situation of putting what you want into words. You actually have to admit to your desires. It might hurt more when you ask explicitly for what you want and your partner then doesn’t give it to you.
That’s all uncertainty.
Imagine that a person gets rid of the belief “I don’t deserve to be happy.” Getting rid of such a belief has actual consequences.
Neither of those beliefs is one I hold, but there are tons of beliefs in that category that still linger in my mind.
Deleting 100 of them could change a lot and produce a lot of chaos.
Those are examples of changing your beliefs in a complex situation. But you’re not simplifying any model—you are substituting one model for another; “I shouldn’t have to ask my partner for what I want” is replaced by “I should have to ask my partner for what I want.” There is no reason to think that the new one is a lower-entropy model—it’s just a new model instead of the old one. The question is whether the model “I’m uncertain whether I need to ask my partner for what I want or not” is more or less complex than “I shouldn’t have to ask my partner for what I want.”
you are substituting one model for another; “I shouldn’t have to ask my partner for what I want” is replaced by “I should have to ask my partner for what I want.”
Given my experience working with beliefs, I don’t think that’s the case.
Unfortunately, most of the evidence I have for that is my personal experience and conversations with other people about theirs, so I can’t link you to a resource making that point.