Biases of Intuitive and Logical Thinkers
Any intuition-dominant thinker who’s struggled with math problems, or logic-dominant thinker who’s struggled with small talk, knows how difficult and hopeless the experience can feel. For a long time I was an intuition thinker; then I developed a logical thinking style, which soon ended up dominating—granting me the luxury of experiencing both kinds of struggles. I eventually learned to apply whichever thinking style was better suited to the problem I was facing. Looking back, I realized why I kept sticking to one extreme.
I hypothesize that one-sided thinkers develop biases and tendencies that prevent them from improving their weaker mode of thinking. These biases cause a positive feedback loop that further skews thinking styles in the same direction.
The reasons why one style becomes overdeveloped and the other underdeveloped vary greatly. Genes have a strong influence, but environment also plays a large part. A teacher may have inspired you to love learning science at a young age, causing you to foster a thinking style better for learning science. Or maybe you grew up very physically attractive and found socializing with your peers a lot more rewarding than studying after school, causing you to foster a thinking style better for navigating social situations. Environment can be changed to help develop certain thinking styles, but that should be supplementary to exposing and understanding the biases you already have. Entering an environment that penalizes your thinking style can be uncomfortable, stressful, and frustrating if you’re unprepared. (Such a painful experience is part of why these biases cause a positive feedback loop: it makes us avoid environments that require the opposite thinking style.)
Despite genetic predisposition and environmental circumstances, there’s room for improvement, and exposing these biases and learning to account for them is a great first step.
Below is a list of a few biases that worsen our ability to solve a certain class of problems and keep us from improving our underdeveloped thinking style.
Intuition-dominant Biases
Overlooking crucial details
Details matter for understanding technical concepts. Overlooking a single word or a sentence’s structure can cause complete misunderstanding—a common blunder for intuition thinkers.
Intuition is really good at making fairly accurate predictions without complete information, enabling us to navigate the world without having a deep understanding of it. As a result, intuition trains us to experience the feeling of understanding something without examining every detail. In most situations, paying close attention to detail is unnecessary and sometimes dangerous. But when learning a technical concept, every detail matters, and the premature feeling of understanding stops us from examining them.
This bias is one that’s more likely to go away once you realize it’s there. You often can’t tell which details you’ve missed, so merely remembering that you tend to miss important details should prompt you to examine things more closely in the future.
Expecting solutions to sound a certain way
The Internship has a great example of this bias (and a few others) in action. The movie is about two middle-aged unemployed salesmen (intuition thinkers) trying to land an internship with Google. Part of Google’s selection process has the two men participate in several technical challenges. One challenge required the men and their team to find a software bug. In a flash of insight, Vince Vaughn’s character, Billy, shouts “Maybe the answer is in the question! Maybe it has something to do with the word bug. A fly!” After enthusiastically making several more word associations, he turns to his team and insists they take him seriously.
Why is it believable to the audience that Billy can be so confident about his answer?
Billy’s intuition made an association between the challenge question and riddle-like questions he’s heard in the past. When Billy used his intuition to find a solution, his confidence in a riddle-like answer grew. Intuition recklessly uses irrelevant associations as reasons for narrowing down the space of possible solutions to technical problems. When associations pop into your mind, it’s a good idea to legitimize them with supporting reasons.
Not recognizing precise language
Intuition thinkers are multi-channel learners—all senses, thoughts, and emotions are used to construct a complex database of clustered knowledge for predicting and understanding the world. With such robust information-extracting ability, correct grammar and word usage are, more often than not, unnecessary for meaningful communication.
Communicating technical concepts in a meaningful way requires precise language. Connotation and subtext are stripped away so words and phrases can purely represent meaningful concepts inside a logical framework. Intuition thinkers communicate with imprecise language, gathering meaning from context to compensate. This makes it hard for them to recognize when to turn off their powerful information extractors.
This bias explains part of why so many intuition thinkers dread math “word problems”. Introducing words and phrases rich with meaning and connotation sends their intuition running wild. It’s hard for them to find correspondences between words in the problem and variables in the theorems and formulas they’ve learned.
The noise intuition brings makes it hard to think clearly. It’s hard for intuition thinkers to tell whether their automatic associations should be taken seriously. Without a reliable way to discern, wrong interpretations of words go undetected. For example, without any physics background, an intuition thinker may read the statement “Matter can have both wave and particle properties at once” and believe they completely understand it. Unrelated associations with what “matter”, “wave”, and “particle” mean blindly take precedence over the technical definitions.
The slightest uncertainty about what a sentence means should raise a red flag. Going back and finding correspondence between each word and how it fits into a technical framework will eliminate any uncertainty.
Believing their level of understanding is deeper than it is
Intuition works on an unconscious level, making intuition thinkers unaware of how they know what they know. Not surprisingly, their best tool to learn what it means to understand is intuition. The concept “understanding” is a collection of associations from experience. You may have learned that part of understanding something means being able to answer questions on a test with memorized factoids, or knowing what to say to convince people you understand, or just knowing more facts than your friends. These are not good methods for gaining a deep understanding of technical concepts.
When intuition thinkers optimize for understanding, they’re really optimizing for a fuzzy idea of what they think understanding means. This often leaves them believing they understand a concept when all they’ve done is memorize some disconnected facts. Not knowing what it feels like to have deeper understanding, they become conditioned to always expect some amount of surprise. Even at what feels like maximum understanding, they have less confidence than logical thinkers do at theirs. This lower confidence disincentivizes intuition thinkers from investing in learning technical concepts, further keeping their logical thinking style underdeveloped.
One way I overcame this tendency was to constantly ask myself “why” questions, like a curious child bothering their parents. The technique helped me uncover what used to be unknown unknowns that made me feel overconfident in my understanding.
Logic-dominant Biases
Ignoring information they cannot immediately fit into a framework
Logical thinkers have and use intuition—the problem is they don’t feed it enough. They tend to ignore valuable intuition-building information if it doesn’t immediately fit into a predictive model they deeply understand. While intuition thinkers don’t filter out enough noise, logical thinkers filter out too much.
For example, if a logical thinker doesn’t have a good framework for understanding human behavior, they’re more likely to ignore visual input like body language and fashion, or auditory input like tone of voice and intonation. Human behavior is complicated; no framework to date can make perfectly accurate predictions about it. Yet intuition can build powerful models despite working with many confounding variables.
Bayesian probability enables logical thinkers to build predictive models from noisy data without having to use intuition. But even then, the first step of making a Bayesian update is data collection.
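As a minimal sketch of what this means in practice (the coin-flip setup and the numbers are my own illustration, not from the post), here is a Bayesian update loop; note that the belief only moves when observations actually come in:

```python
# A toy Bayesian update: is a coin biased toward heads (P(heads) = 0.8)
# or fair (P(heads) = 0.5)? Illustrative numbers only.

def bayes_update(prior, likelihood_if_h, likelihood_if_not_h):
    """Return P(H | observation) via Bayes' rule."""
    numerator = prior * likelihood_if_h
    evidence = numerator + (1 - prior) * likelihood_if_not_h
    return numerator / evidence

belief = 0.5  # start agnostic about "biased"
for flip in ["H", "H", "T", "H", "H"]:  # the collected data
    if flip == "H":
        belief = bayes_update(belief, 0.8, 0.5)
    else:
        belief = bayes_update(belief, 0.2, 0.5)
print(round(belief, 3))  # ~0.724: belief rises with mostly-heads data
```

Without the data-collection step, the loop body never runs and the prior never moves, which is the point.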
Combating this tendency requires paying attention to input you normally ignore. Supplement your broader attentional scope with a researched framework as a guide. Say you want to learn how storytelling works. Start by grabbing resources that teach storytelling and learn the basics. Out in the real world, pay close attention to sights, sounds, and feelings when someone starts telling a story, and try matching that sensory input to the storytelling elements you’ve learned about. Once the basics are subconsciously picked up by habit, your conscious attention will be freed up to make new and more subtle observations.
Ignoring their emotions
Emotional input is difficult to factor in, especially because you’re emotional at the time. Logical thinkers are notorious for ignoring this kind of messy data, consequently starving their intuition of emotional data. Being able to “go with your gut feelings” is a major function of intuition that logical thinkers tend to miss out on.
Your gut can predict whether you’ll get along long-term with a new SO, what kind of outfit would give you more confidence in your workplace, whether learning tennis in your free time will make you happier, or whether you’d prefer a cheeseburger over tacos for lunch. Logical thinkers don’t have enough data collected about their emotions to know what triggers them. They tend to get bogged down and misled by the objective yet trivial details they do manage to factor in. A weak understanding of their own emotions also leads to a weaker understanding of others’ emotions. You can become a better empathizer by better understanding yourself.
You could start from scratch and build your own framework, but self-assessment biases will impede productivity. Learning an existing framework is a more realistic solution. You can find resources with some light googling and I’m sure CFAR teaches some good ones too. You can improve your gut feelings too. One way is making sure you’re always consciously aware of the circumstances you’re in when experiencing an emotion.
Making rules too strict
Logical thinkers build frameworks in order to understand things. When adding a new rule to a framework, there’s motivation to make the rule strict: the stricter the rule, the more predictive power, the better the framework. But when the domain you’re trying to understand contains multivariable, chaotic phenomena, strict rules are likely to break. The result is something like the current state of macroeconomics: a bunch of logical thinkers preoccupied with elegant models and theories whose assumptions are so strict that they’re useless in practice.
Following rules that are too strict can have bad consequences. Imagine John the salesperson is learning how to make better first impressions and has built a rough framework so far. John has a rule that smiling always helps make people feel welcomed the first time they meet him. One day he makes a business trip to Russia to meet a prospective client. The moment he meets his Russian client, he flashes a big smile and continues to smile despite negative reactions. After a few hours of talking, his client reveals she felt he wasn’t trustworthy at first and almost called off the meeting. It turns out that in Russia, smiling at strangers is a sign of insincerity. John’s strict rule didn’t account for cultural differences, blinding him to his client’s reactions and keeping him from updating, putting him in a risky situation.
The desire to hold onto strict rules can make logical thinkers susceptible to confirmation bias too. If John made an exception to his smiling rule, he’d feel less confident about his knowledge of making first impressions, subsequently making him feel bad. He may also have to amend some other rule that relates to the smiling rule, which would further hurt his framework and his feelings.
When feeling the urge to add a new rule, take note of the circumstances in which the evidence for the rule was found. Add exceptions that limit the rule’s predictive power to similar circumstances. Another option is to entertain multiple conflicting rules simultaneously, shifting weight from one to the other as you gather more evidence, as in the sketch below.
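A toy sketch of that last option (the rule names and likelihoods are my own illustration): treat the conflicting rules as competing hypotheses and renormalize their weights after each observation.

```python
# Two conflicting rules held simultaneously, with weight shifted
# toward whichever rule better predicted each new observation.
weights = {"smiling helps first impressions": 0.5,
           "smiling can read as insincere": 0.5}

def observe(likelihoods):
    """likelihoods: P(observation | rule) for each rule."""
    for rule in weights:
        weights[rule] *= likelihoods[rule]
    total = sum(weights.values())
    for rule in weights:
        weights[rule] /= total

# A stranger in Russia reacts badly to a smile: far more likely
# under the second rule than the first.
observe({"smiling helps first impressions": 0.2,
         "smiling can read as insincere": 0.8})
print(weights)  # weights shift to 0.2 / 0.8 in favor of the second rule
```

Neither rule is discarded; John just stops betting everything on one of them.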
Anyone have more biases/tendencies to add?
I’ve built a trap for myself to help mitigate this tendency:
As soon as I think I understand something, I try it.
I.e., if I’m reading a book about circuit diagrams, the moment my intuition clicks in my head and says “aha! This is how a NAND gate works!”, I immediately tell that part of my brain “okay, if you’re so damn smart, build one.” If I’m studying linear algebra, the moment the intuition clicks in my head and says “aha! That’s how an affine transformation works!”, I immediately tell that part of my brain “great! let’s skip to the problems section and try to answer the first 20.”
Occasionally, it turns out that my intuition appears correct, in which case I flag that understanding as “provisionally true, but check these underlying assumptions FIRST at the first sign of trouble”. More often than not, though, I start noticing discrepancies between what my intuitive “understanding” was telling me, and what I’m actually seeing experientially.
About then my intuition starts saying “well, maybe we’re still right, and it’s just—”, at which point I tell it, “you had your chance, buddy, let’s go back and reexamine the details. If it turns out you WERE right and something else is going on, we’ll figure that out by the time we’re done.”
But for me (and, I suspect naively, for a lot of other intuitive people), jumping in and trying something the moment your intuition tells you that you’ve got it is a highly effective learning strategy, so long as you have someone who can tell you before you’re about to do something legitimately dangerous.
Steps would go something like this:
1) Recognize the ‘eureka!’ moment
2) Formulate an experiment
3) Visualize EXACTLY what you think will happen when you perform that experiment
4) Safety check, preferably with a domain expert
5) Perform the experiment
6) Hold yourself accountable
7) Go back to the text and compare your intuition, your results, and the text
8) Repeat 2-7 until you’re internally “sure” your intuition is correct
9) Compare notes with a domain expert
A nice scientific approach!
In essence, yes; but the intended effect is more psychological.
A thing I have noticed about myself is that once the intuitive “aha!” circuit activates, I simply cannot continue paying attention to details. My brain wants to gloss over any remaining information, saying “yeah yeah, I GET it already!”
Jumping straight into the action satisfies my intuition’s need for novelty and immediate feedback.
What’s more, when it turns out my intuition was wrong, I feel genuine surprise—which snaps me back into a state where I’m ready to pay attention to details again!
So for me, it’s less about “doing science” than about providing my brain with the right “flow” to keep me motivated toward the goal of actually understanding a phenomenon.
This post makes a lot of claims that are factual in nature. Many of them seem to make sense, but that doesn’t mean they are true. In fact, some of them may be false; I recall seeing research showing that intuitive thinkers performed better at math/logic problems if they were word problems involving social settings, e.g. the amount of soda to buy for a party or people sitting next to each other. Regardless of this specific claim, the general point is that an article full of factual claims should have citations. If citations are too much trouble, the writer should provide some evidence of expertise to give us a reason to believe factual claims without citations.
Frameworks and claims that make intuitive sense but are not linked to research are risky from an epistemic hygiene perspective. So I felt I had to downvote this post despite it being well written and reasonable sounding.
I first discovered these recurring tendencies in myself and in others. Then I used inferences from what’s scientifically known about intuition to explain how the nature of intuition might cause these tendencies in intuition thinkers.
I would explain this study’s result using the following inferential steps:
1) People (some more than others) have a lot of experience being in social situations
2) It’s not uncommon for people in social situations to face problems that can be easily formalized as math problems, e.g. how to split the bill at a restaurant or the examples mentioned in the study.
3) Intuition uses past experiences to solve problems
4) Intuition thinkers have probably trained their intuition to be able to solve problems found in social situations.
5) Intuition thinkers are more likely to be better at solving math/logic problems found in social situations than math/logic problems found in settings they don’t have much experience with, yet have enough background knowledge to solve.
In the post, I also inferred that intuition thinkers have a hard time corresponding words in math problems with formulas they know. When the words involved are words they’ve corresponded to formulas in the past, they’re more likely to make the right correspondence again.
If I read about the experiment before knowing the results, I wouldn’t be too surprised if intuition thinkers beat out logical thinkers.
--
This is a good example of how I explained the tendencies in the rest of the post. I think the step that demands the most evidence is (3), but I felt there was enough scientific backing for it that LessWrongers would know about. I believe the other inferences are plausible enough that leaving out additional evidence for them doesn’t greatly weaken the argument.
I may well have made inferences in the post that, without additional evidence, greatly weaken my argument. I’d appreciate having any instance of this pointed out.
One such study is the famous Wason selection task, and there, evolutionary psychology gives a fundamentally very different sort of answer than what you’ve given: that we have evolved, innate cognitive modules that solve certain types of problems… but are not used at all when the same abstract form of problem is put in a different context:
The explanation on wikipedia is well worth a read.
After reading the article, it seems their conclusion is still debated. I’m also not convinced, although I have updated toward the general-purpose mechanism hypothesis being less likely correct. There needs to be an experiment where the context is non-social but occurs frequently in people’s lives. For instance, “if you arrived at the airport less than 30 minutes before your departure, you are not able to check in.” Then compare results with those from people who have never been on a plane before.
Edit: I realized my example can also be explained by the “cheater detection module”. In fact, any question whose context is a human-imposed rule can be explained the same way. A better question would be “if your car runs out of fuel, your car cannot be driven.”
I like the main essay, but have one quibble:
If it’s trivial to find these resources, could you not include them in the OP?
Reasons why you should:
1.) You spending the ten minutes it takes to do this will prevent dozens of readers from spending a collective several hours doing the same.
2.) The resources will receive wider use because the trivial inconvenience is gone.
3.) It’s more difficult for people who are not you to know exactly what you’re talking about or what you’re Googling for. Googling “framework for understanding people” is not very useful.
Agreed and added a link with a resource I found with a few minutes of googling.
I’d further add that linking to specific resources facilitates more discussion here about those resources, which we’re unlikely to do if we’re all googling different resources.
Interesting article, but do you have any empirical evidence that people’s thinking styles can be divided so neatly into intuitive vs. logical?
On its face, you seem to be taking this thinking style distinction for granted.
Reflecting on this some more, is an intuitive thinker synonymous with one who primarily uses System 1 style thinking and a logical thinker synonymous with one who primarily uses System 2 style thinking? If so, it’d clarify things quite a bit (for me at least) if you made that clear in your post.
Yes, those are synonymous. I should clarify that.
Hmm, are you sure that they’re synonymous? I initially assumed that your post was talking more about holistic vs. analytical reasoning (see e.g. pages 23-27 of The Weirdest People in the World), which seems to have some similarities with System 1/System 2 reasoning, but also differences which don’t map so clearly to it:
(E.g. this difference wouldn’t seem to be something that you’d expect to arise from just System 1/System 2 processing:)
Ah, I didn’t know about holistic/analytical reasoning before. With the intuition/logical thinking styles I had in mind, I wouldn’t have predicted that intuition thinkers would favor situational information over personality information. This may be more of a cultural difference.
Right, it’s probably cultural—I wouldn’t assume it to be as prominent in Western holistic thinkers, either. Mostly I just brought it up to highlight the fact that the intuitive/holistic distinction may not map perfectly onto the System 1/System 2 distinction.
The reason for apparent anomalies is that “holistic” thinking can involve two different styles: pre-attentive thinking and far-mode thinking. That is, you can have cognition that could be described as holistic either by being unreflective (System 1) or by engaging in far-mode forms of reflection (System 2 offloads to System 1.) In Ulric Neisser’s terms, what is being called “intuitive” might reflect distinctly deeper or distinctly shallower processing than what is called analytic. I sort this out in The deeper solution to the mystery of moralism.
You needn’t buy my conclusions about morality to accept the analysis of modes as related to systems 1 and 2.
Citation Needed.
At a first read, several comments:
Your description of the biases of intuition-dominant thinkers neatly crystallizes several massive failure modes I’ve seen people fall into, many, many times, failure modes that have always made me incredibly frustrated and almost irrationally angry. (I am certainly what you would call a logic-dominant thinker.) I’ve almost always mostly perceived such people’s failures as just “oh god, this person is being horrifyingly stupid”; sometimes I’ve had an inkling of what precisely caused the stupidity (“not recognizing precise language” is a conclusion I’ve almost reached myself). Your explanation may enable me to better communicate with intuition-dominant thinkers in the future; thank you.
I am not convinced that it’s easy, or even really possible, to change from one thinking style to the other. Everything else I’ve read suggests this sort of cognitive leaning is largely innate. Do you have anything other than your own experience to suggest otherwise?
I am having some difficulty understanding the “Ignoring your emotions” section, much less seeing the use of “fixing” this “failing”. (I may expand on this later, when I’ve had some sleep and reread it a couple of times.)
No one style is completely dominant in anyone; we use both depending on the situation. I don’t think I could become intuition-dominant, but I’ve certainly learned to use my intuition more over the years – I’m less inflexible now than I was, say, in high school. Part of that was the transition from learning school science to learning real science—having to actually look at information and come up with ideas. More recently a lot of it has come from having to explain ideas to strangers on a regular basis – needing to develop a better intuitive sense of what they need and where they’re coming from.
I too think it’s uncommon to completely change thinking styles, but I do believe it’s possible to improve the weaker one. I also suspect that thinkers of one style struggle more than the other to develop their weaker style, but I don’t know which.
Being around many people who are into self-development, I often see logical thinkers becoming more intuitive and vice versa. No one makes a complete 180, but incremental improvements are common.
The idea is that feeding emotional data to your intuition can help you better understand your own preferences, understand why you experience certain emotions, and learn how to elicit certain emotions in yourself and others. If you’re not an emotional person, this is probably not a big concern.
I agree with your suspicion, and hypothesize that it’s intuitive thinkers who have more trouble developing logic-based thinking styles.
I think the answer may change depending on age. Older intuition thinkers probably have deeper ingrained habits and less motivation.
After four years of nursing school, I changed from an INTJ to an INFJ on the Myers-Briggs. The medical field is somewhere where you’re constantly getting bombarded with data, some of it very relevant and some of it not, and you have to react fast. There’s value in being able to think logically and systematically through a patient’s symptoms to make sure you’re not missing something–but it’s too slow much of the time, and I’ve learned to at least notice my quick flash-intuitions. The feeling that “something is wrong even if I don’t know why” can be an incredibly valuable indication that you have to check something again, ask someone else to have a look for you, etc. Also, dealing with human beings in the most vulnerable moments of their lives is a great way to develop empathy.
It’s helped me a lot. Anna Salamon recently shared her technique of “when I have a mysterious annoying emotion that I don’t endorse, I ask it what it wants.” I may not endorse the emotion, but I feel it, and even if I try to ignore it, it’ll probably still impact my behaviour, e.g. by making me act less nicely towards a person who irritates me. But I frequently can figure out “what the emotion wants”–for example, it turns out that a large percentage of the time, when quotes from an article annoy me, it’s because I implicitly feel like they’re attacking me because they criticize or poke fun at someone who I identify with.
Example: the movie “The Heat” was hilarious but left me with a bad taste in my mouth, and I was able to track down that it was because one of the main characters, a female cop who was characterized as very smart and capable but nerdy and socially unaware, was poked fun at a lot and eventually changed by becoming less nerdy and more like the other main character, a female cop who broke all the rules with a “git ’er done” attitude (who AFAICT didn’t change at all.) I felt more similar to the nerdy character, and part of me felt that the movie was making fun of nerds in general. I was able to convince myself that this wasn’t a reason to be cranky.
As an “INFJ” who has learned to think in an “INTJ” way through doing a maths degree and hanging out with INTJs, I also agree that different ways of problem solving can be learned. What I tend to find is that my intuitive way of thinking gets me a less accurate, faster answer, which is in keeping with what everyone else has suggested.
However, with my intuitive thinking, there is also an unusual quirk: although my strongly intuitive responses are fairly inaccurate (correct about half the time), that is much more accurate than they have any right to be, given how specific the correct ones are. My intuitive thinking usually applies to people and their emotions, and I frequently get very specific hypotheses about the relationships between a set of people. Learning logical thinking has allowed me to first highlight hypotheses with intuition, then slowly go through and falsify the wrong ones, which leads me to an answer that I think I couldn’t possibly get with logic alone, since my intuition uses things like facial expressions, body language, and voice inflections to gather much more data than I could consciously.
This seems like a great reason to be cranky at the movie… this sort of thing would definitely make me angry at a movie, and I would absolutely endorse that response. Why would you not?
Re: having a “mysterious annoying emotion that [you] don’t endorse”: I’m not sure I understand this experience. Could you elaborate a bit?
Re: your comments on “reacting fast”, and the OP’s comments on “gut feelings”: Do we have any evidence that these “flash-intuitions”, these “gut” judgments, are actually accurate? How accurate are they?
Because almost every book or movie out there has something in it, somewhere, that would annoy me, and I like enjoying books and movies. Because the movie’s content is there anyway and me being cranky about it does absolutely nothing about the underlying culture that makes it normal for nerds to be made fun of in an action movie. Once I notice that I’m cranky and realize why, if I want to do something productive about it I can mention to my friends that I don’t think nerds should be characterized that way, or I can write a blog post about it...without being cranky. Emotions are a signal. Once I’ve decoded that signal, I don’t need the negative emotion anymore, unless it makes me more effective at doing something, which is rarely the case.
We live in a world that’s practically designed to play on our emotions. TV ads basically designed to make me insecure about my appearance/coolness/etc so I’ll want to buy their product? I don’t need to walk around with lingering insecurities; I thought I looked fine and was cool enough two minutes ago, and an ad by people whose job is to manipulate me shouldn’t change that. Lots more people, in more complicated social relationships, than the ancestral environment? Sometimes people do things that annoy me, but I know that the thing they did was harmless and I don’t think it should annoy me, and whereas if they were a family member I’d ask them not to do it for my sake, they’re actually just an acquaintance, and I don’t want to go around feeling annoyed. Or sometimes I make a faux pas and I apologize and take all the necessary steps to fix it–I don’t want to keep feeling guilty after this, but I often do. Sometimes I have to make a choice between two different things I want to do, and I pick the one I want more, but then I’m sad about the one I didn’t pick–I don’t want to keep feeling sad, or to enjoy the thing that I did pick any less.
They’re based on data that I could in theory notice and analyze logically–the way someone’s skin feels, the way someone looks when they’re breathing, etc. It just doesn’t feel like logical analysis anymore; it feels like “oh crap, X.” I think that a lot of intuition could be logical reasoning that’s just been learned to the point that it happens fast, at a subconscious level, by pattern-recognition. And it doesn’t have to be anywhere close to 100% accurate to be helpful. Usually the first thing I do with a gut feeling is get more data. “This patient seems worse...okay, I’ll check all their vital signs again.” Sometimes I’m right and something has changed. Sometimes I’m wrong. I’d still rather have the gut feelings than not. I don’t know how much data there is on this.
Hm. Thank you for taking the time to explain; I definitely appreciate it. Your experience and values seem to differ from mine in a number of ways; that does seem to be what’s behind the OP’s advice being of different utility to us.
As for the bit about accuracy of gut feelings: I take your point about them being a good signal to investigate further. I do remain quite dubious about the use of the gut feelings directly, in place of explicit reasoning. I would very much like to see some data about this.
The advantage of gut feelings and intuition lies in their ability to synthesize years of experience and thousands of variables into one answer in less than a second.
When is this necessary?
During a conversation, someone watching your face is going to be observing how you react (even in the smallest possible ways) as they speak. You don’t have an hour, five minutes, or even two seconds to decide how to present yourself; they’re going to judge you based on that instantaneous reaction (or a lack of one, including a delayed reaction or straight face.)
Anyone who is a natural “people person”—the kinds of people who can get almost anything they want from anyone around them, who make great salesmen or politicians—is going to need to be able to continuously react “properly”, and that means intuitive judgments.
Same thing with any kind of games/sports, or literally any other situation where a quick reaction is required and not immediately responding will doom you no matter what.
The domain of “behaving in such a way as to gain and maintain an advantageous position in social interactions” is very different from other domains, like “diagnose and repair difficulties with computer equipment”, “diagnose and treat afflictions in human beings”, “understand mathematical concepts”, and almost anything else. It seems to me that the domain of social interactions with other humans is in fact a unique domain, not properly comparable (in the context of the current discussion) to anything else.
Certainly gut feelings are key in social interactions; in fact, the most charismatic, likable, and socially successful people do what they do largely unconsciously, and are almost entirely unable not only to explain their technique to others, but even to recognize that there is a technique that other people are not as skilled at using.
My question is, how accurate are gut feelings in other domains — especially those domains where there are objectively right answers and wrong answers, and where it is possible and even easy, in principle, to compare the answer you get from your gut feeling to the actual right answer? In the treatment of computers and people, in math, in science, in engineering? (Answering this question requires data!)
What’s more: I really don’t think that this
is an accurate characterization of where gut feelings in successful social interactors come from. It’s been my experience that such instinctual social success is largely innate. Oh sure, it may be honed, but saying that the gut feeling is a synthesis of years of experience is almost certainly not what’s going on there. More likely it’s a naturally great ability to model others, to respond (unconsciously) to nonverbal cues, etc.
How about sports and fast paced games?
Players are often required to make decisions with no time whatsoever to plan. For example, you might find yourself surrounded by enemies with no warning.
You need to know whether to run on foot, to teleport away, or to fight.
The difference between reacting in a third of a second and a fourth of a second could mean life or death.
Success in this situation, assuming it’s possible, is dependent on your experience in similar situations and your instinctual reaction. Since you do not have the time to think, your decision is almost guaranteed to be imperfect, but any improvement in it is highly beneficial.
Obviously, the same would often apply in war or in certain crisis situations.
You mention lots of fields (computers, math, science, engineering) where your argument is almost tautological: in a case where you have time to reconsider each decision, a slow but reliable and precise method is better than a snap judgment. Yes, I would agree with you, and I would also agree that logical thinking is better than intuitive thinking in many, many situations.
Are you suggesting that the ability to model others or respond to nonverbal cues is innate, rather than learned? I would definitely disagree, though proving it would be difficult. I suspect that it’s a matter of internalizing the results of numerous actions and reactions in different situations. In my experience, it’s often developed by people who travel a lot or are otherwise exposed to tons of different people in situations where being friendly and getting on their good side is very helpful. Some of them were pretty bad at socializing before they were in such a situation (and really gave it the necessary effort to learn).
I disagree, however, when you say that being socially successful is innate.
I do not play sports, but I did spend several years doing high-end raiding (mostly as a main tank) in World of Warcraft, which I think fits your criteria. Raid play is fast-paced and demanding, with necessary reaction times measured in fractions of a second.
I would not characterize good play in a WoW raid as based on intuition. Here is, basically, the process for beating a new, challenging raid boss:
1) Go in, try the boss. The entire raid dies horribly, of course.
2) Meticulously, exhaustively analyze the combat log. Note down all observations made of boss behavior. Correlate data.
3) Brainstorm solutions, based on raid leader’s and key raid members’ comprehensive, minutely detailed knowledge of game mechanics.
4) Make detailed plan. The plan implicitly includes generally correct play from all raid members; note that for almost all classes in WoW, optimal play means following detailed algorithms for ability usage, often worked out at length by top “theorycrafters”, who are often people with advanced degrees in physics and mathematics (no, I am not exaggerating) — plus, of course, extensive experience, to the point where playing correctly is at the level of muscle memory.
5) Attempt to execute plan. Correct execution demands precise, down-to-the-second performance from all raid members.
6) If successful: yay! If failure: proceed to step 2. Repeat until victory.
If this is an intuition-based approach, then I don’t know what “intuition” means.
Of course logical thinking is better when you have time to use it. I’m not asking whether it’s better. I’m asking whether “gut judgments” are accurate, and how accurate they are.
Basically, I see many people claiming that in “crunch time” scenarios, you have no choice but to apply the gut judgment. Ok. But my question is: if you later go back and apply logical reasoning to the (by now, perhaps, irrelevant) problem, does it turn out that your gut judgment was right? How right? How often? Etc.
On a PvE server, or in PvE in general—yes, raid bosses are basically a puzzle that you figure out and then execute to the best of your ability. But take a PvP server: say you’re assembling for the raid and are attacked. This is the fight where you have half a second to realize what someone is trying to do to you and come up with a counter.
I hesitate to say that you have to act on your intuition in a PvP fight; probably a better term is memorized (mostly subconsciously) patterns based on experience—that’s what drives your actions.
On a PvP server, if you’re assembling for a (serious) raid and are attacked, you sigh, say “goddamnit… jerks”, and then res as fast as possible in a way that will get you to the raid ASAP. (And that’s back when you couldn’t just teleport directly to the instance from wherever.)
“Memorized patterns based on experience” is a good characterization (often they’re even memorized consciously). Although, there is a nontrivial element of intuition in competitive (arena) PvP, where your opponent’s psychology is an important factor.
That rather depends on your guild. “Screw the raid, we’ve got faces to melt!” is not an uncommon response :-)
Dirty casuals :p
I have very little experience with WoW, so it’s interesting to hear how deliberate and reasoned a high-level raid is. I have a little experience with sports, combat, and combat sports.
It’s pretty surprising that our brains handle abstractions as well as they do. It’s not at all surprising that they can process and integrate sensory information as fast as they can, because that trait is crucial to survival for most animals.
When Kevin Durant fakes a pass and then shoots from 30 feet away, he’s doing something he’s done thousands of times before. It’s a pattern. But he’s adjusting that pattern for many things that weren’t present in practice, and no two shots are exactly alike. His brain is calculating a trajectory much faster than any of us could with pencil and paper, and his cerebellum is “answering” hundreds of individual questions about muscle opposition that our roboticists might not be able to coordinate at all. He misses some shots, of course. But insofar as a made shot counts for accuracy or right judgment, he probably has better accuracy in much less than a second than anybody could achieve with reflection.
Yes. This is exactly right, and true in WoW as well.
So, I realize this is off-topic, but I’m really curious: wouldn’t it be easier to automate steps 1, 5 and 6?
Some rudimentary efforts to do so have, on occasion, been made. While wholesale bots (i.e., no real-time human control at all) are totally incapable of performing at the level required to beat high-end raid bosses, certain simple, repetitive parts of the process can be automated with add-ons and macros.
There are two issues here: desirability and difficulty.
Desirability: if you automated those parts, then there wouldn’t be a game. No one wants to just theorycraft for a while and then sit there and watch while things happen automatically. Theorycraft is the metagame. The parts where you actually execute the plan are the gameplay. And the gameplay is fast-paced, exciting, adrenaline-rush-generating, skills-demanding, and cool-looking. The excitement of the gameplay is what WoW raiders live for.
Or at least, most WoW players take this stance. Knowing this, Blizzard has consistently banned any game add-ons that go too far in automating things. There’s a fine line, and sometimes it shifts, and sometimes it’s blurry, but the intent is clear: thou shalt play the game yourself, not write code that will play the game for you. (As with all commandments, precise interpretation is a longer discussion.)
Difficulty: The reason you can’t actually fully automate the steps in question (unless, perhaps, you are the game/boss designer, and have access to all the internal game variables) is largely because:
Positioning (i.e., location and movement of characters in the game world) matters a lot. (The reasons why are several, and probably boring, but take my word for this.)
Timing matters a lot. Which is to say, not only must character ability usage be timed correctly with respect to the behavior of game elements (monsters, environmental events, etc.), but players must also time their actions with respect to, and in response to, what other players are doing.
There are many variables that go into correct play. Combinatorial explosion would make automating this a daunting task. For a human, learning a boss strategy, or a play technique, is faster than devising and implementing an algorithm to execute it. To a human, you can just say “kite that mob over there, then release it when I yell on voice chat”, and (if he’s a skilled player) he won’t need to be told twice. Writing code to do this… is likely possible, but not easy.
Not exactly. Yeah, I know this isn’t WoW.
Yeah, my comments were WoW-specific. Roguelikes are very different.
Yes, I am not only suggesting but saying it explicitly (but see caveat below). Huge, obvious case in point: the autism spectrum. People on the spectrum (such as myself) have little to no ability to perceive nonverbal cues or (non-explicitly; again see caveat) model others.
Even for neurotypical (that is, non-autistic) people, there is a range of ability in this area.
Caveat to the above: I think these skills are innate in most people; that there is a range of ability, with the autism spectrum at one end of that range and naturally charismatic, socially apt people at the other; but that the skills can be learned, with effort, as explicit skills.
For instance, autistic people can train themselves to recognize nonverbal cues and social signals; but this is not a matter of simply unconsciously perceiving the cues/signals/situations and just “knowing” their meaning, as it is for most people; rather it is a case of consciously paying attention and looking for things; and the meanings of these cues and signals must be looked up, researched, and memorized. In other words, a logic-based approach to compensate for lack of an intuitive ability.
It is probably also the case that neurotypicals who are not on the extreme positive end of the social ability spectrum, but do not lack the innate intuitive ability, can train their ability in the manner you mention. I would not know, of course, but it seems plausible enough, and consistent with what I’ve heard.
More on gut feelings:
When I was 13 years old, I was a heavily logic-dominant thinker, and I was terrible at reacting under pressure–I found this out when I started taking the required classes to become a lifeguard. I think this is mainly because, even though I could reason through what I was supposed to do, I was misinterpreting the nervousness of social pressure and people watching me perform as uncertainty about what to do. I also tended to be so occupied by thinking things through that I would have “tunnel vision”–my method wasn’t fast enough to flexibly adapt when I thought a situation was one thing and it turned out to be something different.
In first year nursing school, I had gut feelings, and they were screaming at me all the time. I ignored them–justifiably, because they were pretty useless. I didn’t yet have what they call “clinical judgement”, which AFAICT consists of your intuition knowing what details to work from. Four years ago I didn’t really know what it looked like when someone was having trouble breathing–now I could list probably 10 little details to look for. But the mental process isn’t a checklist down those ten items with yes or no for each and making an aggregate score–it’s “this person looks okay” or “crap, this person doesn’t look okay.” And this happens even if I’m not asking myself the question–I look at a patient and my brain pings me that something is wrong. I think the main limitation that my 13-year-old self had to work under was that I ignored my gut feelings, so I frequently didn’t notice new information that didn’t make sense–if it didn’t fit into the mental model I’d made of what was going on, it got filtered out. Intuition is good at noticing confusion. Logical thinking tries to suppress confusion by fitting details into a model even if they don’t fit very well, and it doesn’t answer questions that aren’t asked, either.
Moral of the story: it takes time and effort to train gut feelings. They don’t come from nowhere.
I take it from this that you don’t have the experience of feeling like your brain’s being hijacked into having an emotion that you don’t want?
I guess something that’s atypical about me for a LWer is that I’m very agreeable and somewhat of a conformist. I don’t like to bother other people. Acting on frustration or anger would often make me a bother to other people. Even when I’m in the right, I can fix the situation more effectively from a standpoint of not being angry. My angry self might say things that my later non-angry self would regret, and I’ve gotten pretty good at not doing that.
Very interesting about the training of gut feelings. A bit from my own experience:
I worked for a number of years in tech support positions, where I was often called upon to do PC maintenance/repairs/troubleshooting. After a while, I definitely developed an intuition about what might be wrong with a computer, given some set of symptoms, and often put that intuition to good use in the diagnosis/repair process.
However, one critical advanced skill I learned was not to trust that intuition too much. That is: a machine is brought in for repairs; symptoms are provided; I think “aha, it sounds like a motherboard problem”. Certainly, when going through diagnostic procedures, I should then be on the lookout for confirming evidence. But one of the most serious errors a technician might make in this situation is not being sufficiently thorough in checking the other possibilities for what might be wrong. Other problems might (perhaps more rarely) lead to the same symptoms; furthermore and even more insidiously, the provided symptoms might give no indication whatever about some other, largely unrelated problem.
Astute readers of Less Wrong may recognize such a failure as, in large part, good old confirmation bias.
Edit: And note that the bulwark against making such errors is to have a rigorous diagnostic procedure, follow it, and get into the habit of reasoning about the situation explicitly (bouncing ideas off another person helps with this). In other words: a logic-based approach.
It might (for some people?) be more useful to think of the “noticing confusion” feeling as a distinct thing, rather than simply calling it “intuition”. Certainly I, for one, experience it as a specific mental sensation, so to speak.
I’m not sure… that is to say, I’m not sure what you mean, exactly, so I’ll attempt to describe something I experience, and maybe you can tell me if it’s the same as what you’re talking about.
Sometimes (though more rarely, these days) I will have a certain sort of negative emotional response, which I would describe as anxiety; it generally comes with restlessness and inability to concentrate. Naturally, this is not an emotion I ever want to be having. I have identified several specific sorts of situations that trigger this.
I don’t know that I’d describe it as feeling like my brain’s being hijacked, but only because it seems logically questionable; “hijacked” implies there’s some agent that is the hijacker. I usually have a limited ability to suppress this feeling by analyzing what I think is causing it (usually it’s worry about certain specific sorts of outcomes/events), and attempting to reason about the likelihood of such an outcome, and what I can do to prevent it. So such an emotion is annoying, but not really mysterious (except insofar as I don’t actually know all the details of my own psychology, but then, all my emotions are mysterious in that broader sense).
I don’t know that I’ve ever had any other sort of emotion that I would say I didn’t want to be having. For example, usually when I feel anger, it’s in situations when I think that it is appropriate to feel anger. In the case of frustration, though, you might be right; frustration might be one emotion that’s dispensable once it’s served its role as a signal.
I have, on occasion, said things in anger that led to escalation of conflict, but I don’t think I’ve regretted saying them, since I was in the right, and felt that I was both correct and morally entitled to my comments.
When I said that we seemed to have some different values, I meant that I don’t think there’s anything inherently wrong with being angry, if the anger is justified.
Very true. Confirmation bias and not looking hard enough for a diagnosis is a big issue in medicine, too. I’m not sure if there’s a difference between health care practitioners who were originally logic-dominant thinkers or originally intuition-dominant thinkers, or whether both struggle and have to learn the other skill anyway.
A difference is that when you’re working, you have time to be as slow, thoughtful, and deliberate as you want when figuring out a problem. Obviously it’s better to reason things through as well as noticing intuitions, but System 2 (roughly, explicit reasoning) is slow and effortful and puts a heavy load on working memory, and System 1 (roughly, intuitions) is fast and doesn’t fill up working memory. My younger self wanted to reason through everything logically–and as a result, because nursing is a profession where you’re always working under time constraints, I was always a step behind everyone else, always took longer to get started at the beginning of the day, always stayed an hour past the end of a shift to finish charting. I don’t think this is because I’m a “slow” thinker–I finish written exams in half the time that it takes most of the other nursing students. Also, in my experience having a load on working memory increases confirmation bias–I don’t know if this has been studied, although it wouldn’t be a hard study to do. I’m more curious about things that don’t make sense now.
Modern medicine makes use of checklists a lot. I think this is awesome. I don’t need any urging to use them; I was making personal checklists on my work sheet way before I knew this was already a thing. And “if in doubt, ask someone else to come have a look” is pretty universal too. Also not something I need urging to do.
I don’t literally mean that. It’s just what it feels like.
Even when this is the case, I don’t find that anger helps me get what I want. Then again, being agreeable, a lot of what I want is “not to be in conflict anymore.” Also, I think some people kind of enjoy the powerful feeling that anger gives them. Whereas I find the feeling of anger aversive.
It seems we mostly agree about the usefulness and applicability of gut feelings, as well as their limitations. (Of course, if someone else is aware of any research about their accuracy, I am still interested in seeing it.)
One way I would summarize the ideal setup is: during “downtime”, use logic-based reasoning to come up with a rigorous and easy-to-apply procedure; during “crunch time”, use intuition to generate probable avenues of investigation and likely candidates for diagnosis and solution; supplement with the pre-developed procedure to guard against biases and ensure correct usage of intuition-derived data.
Does this sound like a fair summary?
Just curious, did you have any explicit beliefs that made you ignore your intuition?
I may have had an explicit belief that my own intuitions were wrong most of the time. I don’t think I had a belief that following intuitions period was bad; I always admired people who seemed to be able to do so and get good results.
I’ve noticed a lot that can be explained just by practice effects. Someone practices tai chi for 5 years, they think of movements in terms of tai chi. Someone practices basketball for 5 years, they think of moving a ball in terms of basketball. Someone practices intuition, and wordplay, and they think of problems in terms of those.
Basically, I think it’s like expertise (random google hit that seems to cite the research I was thinking of). Experts build a framework for understanding things. If someone has been immersed in fashion their whole life, they have a whole mental vocabulary for clothes that I am only starting to realize exists. I, as someone who has been immersed in problem-solving my whole life, have a vocabulary for doing it that I take for granted. But if someone who is bad at problem-solving practices logic, or I practice fashion, we can learn the structures experts use, just like people can learn a musical instrument without having been born knowing how to play.
From this perspective, the flaws listed for “intuitive” reasoners stem from a lack of practice / expertise at logical tasks, while the flaws listed for “logical” reasoners stem from a lack of practice / expertise at other kinds of tasks (like, apparently, the “picking up on Russian social norms” task). The reason the division in the OP makes sense is because becoming an expert takes a lot of time, so not everyone will be an expert at logical tasks if it’s not absolutely required, and people who become experts at logical tasks will be less likely to be experts at other things, due to time constraints.
But this is not to say that the division is always a good heuristic—it’s possible for people to be good at two tasks that would normally be put in different camps. And conversely, it’s possible for people to be good at neither of two tasks. That is to say, stupid people exist. The only reason (or so I claim) that we’re grouping professional writers together with stupid people in the “intuition” group is because they’re both non-experts at certain logical tasks—not because stupid people and professional writers form a natural kind.
Intuition thinkers probably wouldn’t have the foresight to learn Russian norms. However, they wouldn’t make a strict rule like “always smile” either. Even if they did normally smile, in Russia their intuition would be thrown off and would probably shift to a better strategy. Without a strict rule, they’d also be more attuned to the immediate environment and intuit that smiling isn’t customary.
Firstly, this post is awesome.
Secondly though, this post touches on the topic of intuition as a useful tool, something I think far too many Logic-Based types throw out without considering its practicality. It’s better to think of it not as a substitute for logical thinking, but as a quick and dirty backup for when you don’t have all the information.
Intuition can occur within two seconds, operates almost completely below conscious awareness, and begins affecting your body immediately. Here are some excerpts from Blink, a book by Malcolm Gladwell, in which he researches how intuition works, what abilities and drawbacks it has, and what biases can affect its overall usefulness.
Ah, a perfect opportunity to be a Logical Thinker, using careful observation and reasoning to find the ideal pattern. What path does intuition take though?
This is all standard enough, but what is more impressive is the fact that people started generating stress responses to the red decks by the tenth card.
That’s right, palms began to sweat in reaction to the red decks almost immediately, naturally pushing people towards the blue decks before they could even understand why, or even recognize what they were doing.
There are better examples of applied intuition in Blink, but I’ve purposely used only one of the earlier examples from the Amazon sampler, out of respect for the book. I’d recommend reading the whole thing though, especially if you’re interested in understanding what intuition does while you’re thinking things through.
The problem with the stress response is that it’s likely based only on the potential for loss rather than any real intuitive calculation. Suppose blue paid (+10 / −5) at 50% each, and red paid (+300 / −250) at 50% each. Red is the better deck, but I would very much expect ordinary people to show a stress reaction to red, and more so for cards with a wider spread of possible outcomes!
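To spell out the arithmetic in that hypothetical (the payoff numbers are mine, not from the study), the expected values per card would be:

$$E[\text{blue}] = 0.5(+10) + 0.5(-5) = +2.5, \qquad E[\text{red}] = 0.5(+300) + 0.5(-250) = +25.$$

Red’s expected value is ten times blue’s, but its swing per card (550 points) dwarfs blue’s (15 points), so a stress response keyed to swing size rather than expected value would steer you toward the worse deck here.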
How does all of this interact with the fact that almost everyone will continue to take some number of cards from all decks the entire time, rather than going for exploration early and then exploitation late?
Can I ask that the title be changed to “Biases of Intuitive and Logical Thinkers”? I almost didn’t read this due to the very generic title.
Done
Sorry, this essay doesn’t make sense to me. I don’t understand the framework underlying it, the context in which it lives. It just looks to me like a mish-mash of generic life advice along the lines of pay attention! (“overlooking crucial details”) or listen to yourself! (“ignoring their emotions”).
This is quite related to ignoring information that doesn’t fit into a framework, but another common logical bias is forcing information into your framework when it doesn’t fit.
My most obvious personal encounter with this was in my high school English classes, where my teacher frequently criticized me for having an “overbearing” interpretation of the text. For a perhaps more relatable example, I’ve known people who have just learned about status to interpret absolutely every behavior in the context of status, even when that doesn’t quite fit.
Are you saying it’s you who was forcing non-fitting information into a framework here, or your teacher?
I’m heavily intuition-dominant, in that I tend to minimize the use of “System 2” thinking whenever possible and make decisions based primarily on emotion. Some more patterns I’ve noticed:
Strategy. System 2/Logic-dominant thinking is much better for planning things out, especially when you’re working with a novel situation. If you use System 1 when playing a game such as Settlers of Catan for the first time, you’ll have a very low chance of winning. If you use System 2, you’ll generally perform better (at least on the first play).
Decision speed. Intuitive thinkers tend to make decisions quickly, and to (unconsciously) assign a high cost to “opening” a made decision. Logical thinkers tend to make decisions more slowly (but often more correctly) and be much more willing to debate, consider alternatives, etc.
Responsiveness to external feedback. Intuitive thinkers learn primarily by trying things and seeing what works, since most of their skills are on the 5 second level. This means fast learning with certain types of skills, but also a tendency to fail at things that don’t provide frequent feedback.
(Un)-awareness of their skills. Intuitive thinkers tend to have a harder time explicitly describing their thought process, or how they developed a skill. Someone who came up with a plan to learn probability and executed it effectively can probably give you directions on how to study, but someone who knows how to charm a room will have a difficult time articulating exactly what they do.
Severe struggle with unusual tasks/skills. Heavily intuitive people who do most of their communication online are often literally afraid of phones, to the point where they’ll procrastinate on making calls for hours. Same with networking, going to office hours, etc. Some practice making phone calls tends to mitigate this problem pretty quickly, but it means intuitive thinkers often have a very steep learning curve when it comes to new skills. Conversely, logic-dominant thinkers seem to have an easier time working outside their comfort zone.
None of these are insurmountable. For example, I’ve focused on improving my communication skills over the past few years and it’s shown tremendous results, to the point where people explicitly ask me to give talks or explain certain topics. But they do seem to be fairly strong tendencies.
This is a feature of people with an intuitive thinking style? Really? The description applies to me near-perfectly (I am, perhaps, “afraid of phones” more figuratively than literally), and I’d certainly describe myself as an almost totally logic-based thinker.
Something is wrong here. Either this is not, in fact, a feature of intuitive people, or the intuitive/logic-based framework is flawed.
Interesting. Thanks for the data point.
As a general pattern, do you feel like you have a harder time going outside your comfort zone than most people?
Yep. (Very relevant note: I am on the autism spectrum.)
Good to know. Generally speaking, I would expect degree of autism to have more predictive power than intuitive vs. logical, especially in that case. Controlling for autism, though, I would say that on average, logical thinkers I’ve met tend to be more willing to recognize that the actions that lead to their goals are outside their comfort zone (and take those actions), while intuitive thinkers I’ve seen tend to be less inclined to do things that don’t feel good (emotionally). It ties in with the ignoring-emotions point in the original post: intuitive thinkers are better at using their emotions productively, but tend to be ruled by their emotions even when that’s not useful (and this is definitely a problem for me).
Good observations.
As an intuition-dominant thinker, how did you improve your logical side?
Rather than improve my logical side I’ve mostly come up with mechanisms that let me avoid System 2 thinking.
For example, to improve my overall communication skills I practiced writing on forums with an upvote mechanism. This let me get feedback pretty rapidly, and over time my communication skills improved significantly, even when it came to explaining how I had developed System 1 skills.
For strategy, I mostly rely on conversations with my friends. I can get myself to strategize if I sit down and concentrate but it’s very tiring, whereas when I talk to smart people they are usually able to quickly see holes in my long-term plans and point them out to me.
As for forcing myself to actually use System 2 around 5% of the time: I know I have shifted from thinking consciously <1% of the time to being able to use System 2 on command (though it is still very tiring). But I’m not really clear on what enabled the change. One possible explanation: throughout the last 5 years I have been part of several board game meetups where I would have the opportunity to play a game just once or just a few times, so I was forced to think consciously if I wanted to have any chance of winning.
I think your example is bad. It’s the first commenter who is confused, not the second one.
The correct formulation is “If, for any small positive number (epsilon), I can show that the difference between A and B is less than epsilon, then I have shown A and B are equal.”
The first commenter screwed things up by saying “If, for any small positive number you give me (epsilon), I can show that the difference between A and B is less than epsilon, then I have shown A and B are equal.” The second commenter’s objection is valid.
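In symbols (a standard fact about real numbers, spelled out here for concreteness):

$$\bigl(\forall \varepsilon > 0:\ |A - B| < \varepsilon\bigr) \;\Rightarrow\; A = B.$$

The epsilon is universally quantified: the premise must hold for every positive epsilon, not merely for one that happens to be offered.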
Agreed. I should take it out.
I’m not sure if your first example “Ignoring information they cannot immediately fit into a framework” includes “sticking to an elegant, logical framework and considering cases where this does not occur to be exceptions or aberrations even when they are very common”.
That’s something you see quite a lot with some otherwise quite rational people: the ‘if my system can’t explain it, the world’s wrong’ attitude. As illustrated here: http://xkcd.com/1112/
You mean “sights”, yes?
And what would you look at, if not sites? ;)
Dredging this from a deeply buried comment:
Come to think of it, I don’t know what “intuition” means. Is it anything but a label stuck on processes inaccessible to consciousness that produce thoughts? Like “free will” is a label stuck on processes inaccessible to consciousness that produce decisions?
Probably. I don’t think “intuition” is one process–for example, for neurotypical people apparently reading faces is innate and unlearned, and there isn’t and never was conscious processing that later became unconscious habit. I’m pretty sure that other intuitions start out as conscious reasoning and simply become overlearned to the point that the reasoning happens really, really fast and doesn’t feel like “thinking about” anymore–either that, or the “intuitions” were originally a separate process that was useless, and studying X fed them with information to the point that they became useful. Either way, not innate.
I do think it’s a useful word to have, even if it’s not rigorous. At the very least it’s shorter than “processes inaccessible to consciousness that produce thoughts.”
It’s useful as what Edward de Bono (who is too rarely mentioned on LessWrong) calls a “porridge word”: a name given to a vague concept just in order to have a name to call something by when we know little about what it is. Like porridge, it has no flavour of its own, and can take on any shape without resistance. Or to drop the metaphor, the word says nothing about the thing it vaguely points to, and can come to mean whatever subsequent evidence tells us about it. But a porridge word is never an explanation: any definition of a porridge word should include somewhere the words “we don’t know”.
ETA: 29 hits for “de Bono” in the LW search box, so not as unmentioned as I had thought.
Is it the same thing which Marvin Minsky calls a “suitcase word”?
See http://edge.org/conversation/consciousness-is-a-big-suitcase : “Most words we use to describe our minds (like “consciousness”, “learning”, or “memory”) are suitcase-like jumbles of different ideas. … those suitcase-words (like intuition or consciousness) that all of us use to encapsulate our jumbled ideas about our minds. We use those words as suitcases in which to contain all sorts of mysteries that we can’t yet explain.”
Something like that, although de Bono sees them more positively, as tools for thought that let you talk about something when you don’t know what it is and avoid premature commitment to explanations. Minsky is talking about what happens when they are mistaken for explanations.
One type of intuition that everyone has and can understand the feeling of is linguistic intuition. Natural language has very subtle rules that native speakers follow effortlessly, but those speakers would find it very difficult to consciously articulate how those rules work.
Yes, linguistic intuitions are an example of thoughts arising by a process inaccessible to consciousness.
The problem I have with the concept of “intuition” is that it’s a non-apple sort of thing. It means, “I don’t know how I know this”, but has no implications for what the mechanisms really are. So I don’t see a natural division between “intuitive” and “logical” thinking. Stuff that you’re aware of and stuff that you aren’t are both going on all the time. The boundary between them can itself change. Even for the stuff that you are aware of thinking, you aren’t aware of the mechanism underlying that. However wide the circle illuminated by awareness, it always has a boundary, beyond which is non-awareness. Is there a good reason to suppose that essentially different mechanisms are in play inside and outside that circle?
I mostly agree with this article but object to your use of the word “intuition”. What you’re calling “intuition” is closer to “feeling” in the Myers-Briggs sense than to intuition (in either the MB or the common sense of that word).
In particular, logical people can also be very intuitive; they just have intuitions about, say, the distribution of primes rather than about other people.
I’m not sure the first example is really an error on the part of the commenter, unless there was an implicit shared technical usage at play. The word ‘any’ in the quote you give below is not very clear. I knew what it meant, but only because I understood what the argument was getting at.
“If, for any small positive number you give me (epsilon), I can show that the difference between A and B is less than epsilon, then I have shown A and B are equal.”
In this case, ‘any’ means ‘if, no matter which number is given, the following analysis applies, then the conclusion is reached’.
Compare:
“If, for any small positive number you can give me (epsilon), I can show that the difference between A and B is greater than epsilon, then I have shown A and B are not equal”.
Here, the natural reading is ‘if a single case is found where the following analysis applies, the conclusion is reached’
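To make the contrast explicit, here is one way to symbolize the two readings (my own gloss, not anything from the original exchange):

$$\text{universal: } \bigl(\forall \varepsilon > 0:\ |A - B| < \varepsilon\bigr) \Rightarrow A = B; \qquad \text{existential: } \bigl(\exists \varepsilon > 0:\ |A - B| > \varepsilon\bigr) \Rightarrow A \neq B.$$

Both happen to be true of real numbers, which is why the English ‘any’ can slide between opposite quantifiers without being noticed.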
As I said, this may be a failing of technical language on my part, but I don’t think normal English is clear here.
I think you’re right. I was using prior knowledge to interpret the argument correctly. The ambiguity in the language definitely makes my example weaker. I tried empathizing with the commenter as an intuition thinker, to figure out which mistake most likely caused the confusion. I still think the commenter most likely didn’t pay attention to those words, but it’s also quite likely he read them and took the technically defensible alternative interpretation.
In his situation, I’d probably read ‘any’ in the second sense simply because as a non-mathematician I can imagine the second sense being a practical test: (I give you a number, you show me that the difference between A and B is smaller, we reach a conclusion) whereas the first seems esoteric (you test every conceivable small number...)
On the other hand, the first reading is so blatantly wrong, the commenter really should have stepped back and thought ‘could this sentence be saying something that wasn’t obviously incorrect?’ Principle of charity and all that.
I believe it’s more of a spectrum.
That said, I think people should drop the notion that humans are rational. We’re boundedly rational: emotive processing is balanced against logical reasoning.
It’s often said in pop culture/society that being rational is somehow “better” than being emotional. I used to believe this long ago, but now I think that’s bull. Emotions exist for a perfectly valid purpose: as a guide to our environment and to how we interact with and control it. The fact is, many humans make decisions or process information on purely emotive rather than rational grounds. As an example, two queues were open at the supermarket the other day. The first had a really obese woman serving but was far shorter. The second had a cute Indian woman serving but was far longer. I took the longer queue just to say hello, make chit-chat, and flirt with the cute woman. To some this is “irrational”, but to me it’s emotive and instinctual. And generally this is how humans often behave in the real world, and there is nothing wrong with that.
If chatting with cute women has utility for you, your decision was rational. Rationality doesn’t mean you have to restrict yourself to “official” payoffs.