Our actions reveal what we actually want, not what we believe we want or believe we should want. No one chooses against their own judgment. What we do is choose against our understanding of our own judgment, and that is a far subtler matter.
More like, we have multiple parts of the brain which can reach different judgments, and rational arguments act on the wrong part. I think this is what you were getting at (and your subsequent examples seem to support that), but it should be explicit.
I think the rational (mostly linguistic) parts of our brain can influence decisions made by other parts, if we’re smart about it. The main trap seems to be that when the conscious and subconscious parts of our brain disagree, we may decide that we didn’t will hard enough. So we try to “will harder”, which to the linguistic part of our brain means sticking the word “very” in front of everything and generating a bunch of negative self-views, which has the opposite of the desired effect on our subconscious.
jimrandomh said: “More like, we have multiple parts of the brain which can reach different judgments, and rational arguments act on the wrong part.”
Exactly. I think the notion of modularity from evolutionary psychology would help us understand some types of akrasia. While consciousness is probably not a complete bystander, as Annoyance notes, it’s merely one of the mental modules in the brain. This hypothesis also explains Annoyance’s observation that there may be factors in our judgment that we don’t understand.
If we act against what we say we want, it may not mean that we “didn’t really want it,” but that one part of the brain wanted it while another part didn’t, and the second part won.
“Want” is not always a unitary phenomenon inside the brain; neither is “judgment.”
Please note: I said consciousness is ALMOST without influence. It’s not completely so. The problem is that it attributes nearly everything we do to itself, instead of the few bits it actually contributes.
Quoting the earlier comment: “So we try to ‘will harder’, which to the linguistic part of our brain means sticking the word ‘very’ in front of everything and generating a bunch of negative self-views, which has the opposite of the desired effect on our subconscious.”
I agree with you; it’s either the opposite effect, or just no effect. You could also think of this as a case of “the leader not listening to subordinates”, in that the “try harder” mode is ignoring whatever the actual problem is, i.e., the subconscious goal or prediction that’s interfering. In my experience it’s much more important to teach people to be able to listen to themselves (i.e., become aware of what they already believe/expect/desire) than to talk to themselves (i.e., push new information in).
Every time I try listening to myself, my subconscious invents some new and “deep” explanation that I then actually believe for a day or two. It’s an endless quest.
A more fruitful strategy for me was taking some minutes or hours every day to grow something new in my mind, ignoring the old stuff completely. A couple of times the new stuff eventually grew strong enough to overthrow the old stuff for control of my life without much struggle.
Quoting the comment above: “Every time I try listening to myself, my subconscious invents some new and ‘deep’ explanation that I then actually believe for a day or two. It’s an endless quest.”
That’s not listening, and it’s not your subconscious. Your other-than-conscious mind doesn’t do explanations—heck, it doesn’t even grok abstractions, except for an intuitive (and biased) sense of probabilities for a given context (external+internal state).
The type of listening I’m referring to is paying attention to autonomous responses (e.g. a flash of people laughing at you if you fail), not making up theories or explanations. It’s harder to learn, but more worthwhile.