irrationality of the unconscious mind continuing to pursue subgoals when they are clearly no longer connected to supergoals, the unconscious’s vulnerability to proximity/scale biases when dealing with morality, and several others.
The conscious is guilty of these too.
I’ve read plenty about the unconscious, and I admit it’s astonishingly complex and capable. So are honeybees. But when bees perform unbelievably complicated tasks, I don’t assume they therefore have human-level intelligence, and I think the unconscious’ actions are more like the honeybees’ than people’s.
Okay, but carrying the analogy over, I’m sure you also don’t trivialize the value of honey!
However, if there’s something you think I should know more about, why not recommend me specific articles, authors, or books?
You could start with making yourself aware of the non-conscious mind’s ability to solve CAPTCHAs, an AI-complete problem, and current conscious minds’ inability to figure out how they do it with enough clarity to re-implement it in software.
Actually, it’s funny you mention CAPTCHAs as your example. If you’re going to go that far, why not also attribute skill at chess to the unconscious? After all, it’s got to be the unconscious that screens out most of the several dozen possible chess moves each turn and lets your conscious concentrate on the few best, and you can generalize from chess to practically all matters of strategy. Or for that matter, how about language? All my knowledge of English grammar was purely unconscious until I started studying the subject in high school, and 99% of my grammar use still comes from there.
So the issue’s not whether it can perform complex tasks. I don’t know exactly what the issue is, but I think it connects to the concept of “personhood” somehow. I question whether the unconscious is more than a collection of very sophisticated mental modules, in the same way that a bird’s brain may have a flight dynamics module, an astronomical navigation module, a mate-preference-analysis module, and so on.
The computing hardware of my brain contains a program for recognizing letters, a program that detects potential mates and responds with feelings of lust, a program that interacts with my reward system in such a way as to potentially create alcoholism, and so on. They’re all computationally very impressive. But I don’t see why I should assign them moral status any more than I would feel morally obligated to listen to a laptop on which I had installed a program that detected the presence of beautiful women nearby and then displayed the words “mate with this woman”. I don’t want to privilege these programs just because they happen to be located inside a human brain and they get reflected glory from some of the other things human brains can do.
To make me want to assign them moral status, you’d have to give me evidence that there was something that it felt like to be my lust. This seems kind of category-error-ish to me. I feel my lust, but my lust itself doesn’t feel anything. You may feel sorry for me for having to deal with my lust, but feeling sorry for my lust because I don’t choose to satisfy it is in my opinion a waste of sorrow. It’s also an infinite regress. If I feel unhappy because I have unfulfilled desire, and my desire feels unhappy because it’s unfulfilled, does my desire’s unhappiness feel something? Why stop there?
I have a feeling this problem requires more rigor than I can throw at it right now. I’ve been trying to think about it more clearly so as to hopefully eventually get some top-level posts out of it, but this is the best I can do at the moment.
I’ll bite the bullets in your first paragraph. So chess also relies on non-conscious skills. What trap did I just fall into?
I don’t see why I should assign them moral status any more than I would feel morally obligated to listen to a laptop …
There is a major difference between your unconscious mind and a laptop with the same output: specifically, the unconscious mind has a direct, seamless, high-bandwidth connection to your mind. When you recognize a face or a letter, you don’t have to pass it to a laptop, look at the output, and read the output. From your conscious mind’s perspective, you just get insta-recognition. This makes it more valuable than a laptop—in all senses—just as mental addition is more valuable than an equally fast hand calculator.
If and when someone makes a machine that can do these tasks faster, and still interface seamlessly, in the unconscious’s stead, then you will be justified in trivializing the latter’s value. Just like you would feel less bad (though not completely indifferent) about the extinction of honeybees if honey could be more efficiently synthesized.
The only case where the above reasoning doesn’t apply is, as you point out, in values. Why is the unconscious mind’s choice of values, er, valuable? Why are you morally bound to its decrees of lust? The answer is, I don’t know. But at the same time, I don’t know how you can clip out the lust while retaining “you”—not given your existing brain’s architecture. That is, I disagree that the brain is as modular as you seem to think, at least if that’s what you meant by the use of “modules”.
And remember, pure value judgments are only a small fraction of its outputs.
Re: I question whether the unconscious is more than a collection of very sophisticated mental modules, in the same way that a bird’s brain may have a flight dynamics module, an astronomical navigation module, a mate-preference-analysis module, and so on.
...and what do you think your conscious mind is, then—if not a collection of sophisticated mental modules?
Wikipedia gives Fodor’s list of eight characteristics of “mental modules”, which include “domain specificity”, “fast speed”, “shallow output”, “limited accessibility”, “encapsulation”, et cetera, and quotes someone else as saying the most important distinguishing feature is “cognitive impenetrability”.
In other words, “module” has a special definition that doesn’t mean exactly the same as “something in the mind”. So when I “accuse” the unconscious of being “modules”, all I’m saying is that it’s a bunch of single-purpose unlinked programs, as opposed to the generic and unified programs that make up the conscious mind. This seems relevant since it makes it harder to accept the idea of the unconscious as a separate but equal person living inside your brain.
If there are other definitions of “module” that include anything in the mind, and you’re using one of those, then yes, the conscious mind is a module or collection of modules as well.
In some respects, consciousness is largely a perceptual filter—the attention filter—whose role it is to block out most sensory inputs from most systems most of the time. From that perspective, the contents of consciousness primarily consist of the outputs of normally-unconscious modules. The bit of the mind that switches attention around might itself be relatively small—and gains the illusion of size by being able to illuminate many areas of the mind—by damping down perceptions from everywhere else.
Anyway, you might have a case that consciousness is somehow “less modular” than all the other parts of the mind.
This whole “identifying with consciousness” business is totally bizarre to me. I hate to come on with the self-help—but: consciousness is tiny! You are so much more than that! Please repeat to yourself 1,000 times—“I am not my conscious mind!” The idea that you are your consciousness is an illusion created by your ego—which thinks it is the most wonderful thing in the world—that everything revolves around it—and that it is you. If you get some perspective, you should be able to see what utter nonsense that is.
The bit of the mind that switches attention around might itself be relatively small—and gains the illusion of size by being able to illuminate many areas of the mind—by damping down perceptions from everywhere else.
And the PCT hypothesis for why this is so (predating the Society of Mind by a decade or so), is that consciousness is effectively the debugger or test rig for the rest of the brain: a tool whose job is the tuning, adjustment, and extension of the brain’s unconscious control systems. The conscious mind is heavily engaged in any sort of skill acquisition, “noticing” what perceptions are associated with success or failure, and this noticing process is key to wiring up new control circuits.
From this perspective, consciousness is effectively an on-call maintenance person, a tech support rep for the unconscious. Which provides a good evolutionary reason for “higher” animals to have higher degrees of consciousness; the more flexible the creature, the more advanced the tech support required. ;-)
That humans have decided to rebel and take over the company instead of functioning in a strictly support capacity is a separate issue.
And when the revolution isn’t going so well, we call it “akrasia”.
So the key to a smooth takeover is realizing that if the unconscious machinery isn’t working well, then you will suffer right along with your unconscious. You need a win-win solution, and the unconscious is pretty easily satisfied, being just a big dumb array of thermostats and all.
An array which—being that you’re its tech support rep—you can actually rewire. In fact, most of what’s in there, you consciously put there at some point, or at least didn’t object to.
But if you treat it like it’s an independent mind—which it isn’t—and an enemy (which it also isn’t) whose demands should be disregarded, then you’re never even going to perceive what is actually going on in there, and therefore won’t be able to tell how to change any of it. And you’ll just keep fighting, instead of debugging.
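The “big dumb array of thermostats” framing above can be made concrete with a toy sketch, in the spirit of perceptual control theory. All names and numbers here are purely illustrative assumptions, not any real PCT implementation: each unconscious subsystem is a control unit that pushes a perception toward a reference value, and the “tech support” move is to inspect and retune the reference rather than fight the unit’s output.

```python
class ControlUnit:
    """One 'thermostat': drives action to keep a perception near a reference."""
    def __init__(self, name, reference, gain=0.5):
        self.name = name
        self.reference = reference  # the goal state this unit defends
        self.gain = gain

    def act(self, perception):
        # Error-driven output: push the perceived state toward the reference.
        error = self.reference - perception
        return self.gain * error

# The "unconscious" on this picture is just many such units running in parallel.
# (These particular units and numbers are invented for illustration.)
units = {
    "warmth":  ControlUnit("warmth", reference=22.0),
    "company": ControlUnit("company", reference=5.0),
}

def conscious_retune(unit, new_reference):
    # The "tech support" role: consciousness doesn't fight the unit's
    # output; it inspects the goal the unit defends and rewires it.
    unit.reference = new_reference

# Let the array run: each unit independently pulls its perception
# toward its own reference, with no unit aware of any other.
world = {"warmth": 18.0, "company": 2.0}
for _ in range(50):
    for key, unit in units.items():
        world[key] += unit.act(world[key])
```

The point of the sketch is that suppressing a unit’s output (fighting) leaves the error, and hence the push, in place; only changing the reference (`conscious_retune`) actually changes what the machinery pursues.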
I think we agree. Your statement that the unconscious is “just a big dumb array of thermostats” is just what I was trying to get across, plus as you said that it isn’t an independent mind.
I interpreted Robin (I’m still not sure if I’m right) as suggesting the unconscious is a full and separate mind whose preferences deserve respect for the same reason you’d respect another human’s preferences. So that, for example, if you wanted to stay sober but your unconscious wanted to drink, you “owe” it to your unconscious to compromise, in the same way you’d be a bad friend if you didn’t take a friend’s preferences into account. All I am trying to say is that the unconscious doesn’t deserve that kind of respect.
If you’re saying that my conscious mind can achieve its own goals better by working with the unconscious in some particular way, well, you’re the expert on that and I believe you.
So that, for example, if you wanted to stay sober but your unconscious wanted to drink, you “owe” it to your unconscious to compromise, in the same way you’d be a bad friend if you didn’t take a friend’s preferences into account. All I am trying to say is that the unconscious doesn’t deserve that kind of respect.
If you’re saying that my conscious mind can achieve its own goals better by working with the unconscious in some particular way,
Yes. The reason I argued with your notion that you shouldn’t pay any attention to your unconscious goals is because, with relatively few exceptions, your unconscious goals are your goals.
Generally, they’re either goals you share with your unconscious (like staying alive), or goals you put in there based on what you thought was useful or valuable at some point in your life. Once such goals are acquired, any action patterns that lead towards those goals tend to stick until better action patterns are learned, or the goal is consciously deactivated.
But it isn’t enough to say, “I don’t want X any more”, when you don’t actually know what, precisely, X is. That’s why you actually do need to pay attention to your unconscious goals, so that you can either find alternative ways to satisfy them, or verify that in fact, you no longer require them to be satisfied on your behalf.
Think of it as a safety interlock of sorts, that allows you to maintain a sincere verbal belief and expression that you don’t want X, while leaving the machinery in place to nonetheless acquire X without your conscious knowledge or consent.
To borrow the metaphor of the Sirens, your unconscious won’t untie you from the mast until you stop fighting to get free. When you once more become the person who ordered yourself tied to the mast in the first place, then and ONLY then will your unconscious accept a reversal of your original orders.
That’s why you need to pay attention to the goals, so you can step into the mental shoes of the “you” who put the goals in in the first place, and then either reconsider the original goal, or find a better way to get it that doesn’t have side effects.
But unless you can actually acknowledge the desirability of the goal in question, your unconscious effectively assumes you’re merely under social pressure to demonstrate your desire to adhere to the ways of the tribe, and ignores your attempt to give it “new orders”.
This whole “identifying with consciousness” business is totally bizarre to me. I hate to come on with the self-help—but: consciousness is tiny! You are so much more than that! Please repeat to yourself 1,000 times—“I am not my conscious mind!” The idea that you are your consciousness is an illusion created by your ego—which thinks it is the most wonderful thing in the world—that everything revolves around it—and that it is you. If you get some perspective, you should be able to see what utter nonsense that is.
Sounds like an outside-the-box box. So I have a job interview tomorrow morning and my conscious mind is telling me to go to sleep early, but my unconscious keeps me up worrying and watching TV until midnight. Should I respect the secret wisdom of the unconscious mind that my deluded ego-self is keeping me from understanding, or should I shut up and figure out some way to get to sleep?
I like Buddhism. I meditate and I’m very interested in exploring the depths of my unconscious mind and possibly at some point dissolving my ego and achieving full awareness, whatever the heck that means. But the “unconscious” referred to in the original post is what’s telling the drunkard to get another shot of whiskey. I don’t think the Buddha would approve of that particular manifestation of it any more than anyone else, and all I’m saying is that this drunkard is justified in being against this desire, rather than thinking that since it’s their unconscious mind they have to accept it.
I feel like I already addressed such issues when I wrote: “We do not have to choose between these two theories.” Sometimes the conscious goals are best, and sometimes the unconscious ones are. You have given some examples of the former, but there are also examples of the latter.
Re: I question whether the unconscious is more than a collection of very sophisticated mental modules

So’s your conscious. The unconscious just isn’t connected up the right way for deliberation and reflectivity.

(IAWYC)

Re: it’s a bunch of single-purpose unlinked programs, as opposed to the generic and unified programs that make up the conscious mind

The conscious mind is probably pretty modular too.

http://en.wikipedia.org/wiki/Society_of_Mind

Re: And you’ll just keep fighting, instead of debugging.

Not really a good use of your time, IMO.

Re: I feel like I already addressed such issues when I wrote: “We do not have to choose between these two theories.”

Sorry, long time ago and different section of the comments. I think with that clarified I mostly agree with you anyway.