Very interesting about the training of gut feelings. A bit from my own experience:
I worked for a number of years in tech support positions, where I was often called upon to do PC maintenance/repairs/troubleshooting. After a while, I definitely developed an intuition about what might be wrong with a computer, given some set of symptoms, and often put that intuition to good use in the diagnosis/repair process.
However, one critical advanced skill I learned was not to trust that intuition too much. That is: a machine is brought in for repairs; symptoms are provided; I think “aha, it sounds like a motherboard problem”. Certainly, when going through diagnostic procedures, I should then be on the lookout for confirming evidence. But one of the most serious errors a technician might make in this situation is not being sufficiently thorough in checking the other possibilities for what might be wrong. Other problems might (perhaps more rarely) lead to the same symptoms; furthermore and even more insidiously, the provided symptoms might give no indication whatever about some other, largely unrelated problem.
Astute readers of Less Wrong may recognize such a failure as, in large part, good old confirmation bias.
Edit: And note that the bulwark against making such errors is to have a rigorous diagnostic procedure, follow it, and get into the habit of reasoning about the situation explicitly (bouncing ideas off another person helps with this). In other words: a logic-based approach.
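As a toy sketch of that division of labor (the component names and checking interface here are my own invention, not any real shop's procedure), the idea is that intuition may reorder the checklist but never shorten it:

```python
# Sketch: intuition reorders the checklist but never prunes it, so the
# non-favored hypotheses still get checked (all names here are invented).

CHECKLIST = ["power_supply", "ram", "storage", "motherboard", "software"]

def diagnose(symptoms, intuition_rank, run_check):
    """intuition_rank(part) -> priority (lower = check sooner);
    run_check(part, symptoms) -> True if that part fails its test.
    Returns *every* failing part, not just the first one found."""
    ordered = sorted(CHECKLIST, key=intuition_rank)  # reorder, don't prune
    return [part for part in ordered if run_check(part, symptoms)]

# The gut says "motherboard", but the rigid checklist still finds the RAM.
gut_priority = {"motherboard": 0}
faults = diagnose(
    symptoms=["random reboots"],
    intuition_rank=lambda part: gut_priority.get(part, 99),
    run_check=lambda part, symptoms: part == "ram",  # stand-in diagnostic
)
# faults == ["ram"]
```

The point of the structure is that the intuition only affects efficiency (what gets checked first), never coverage (what gets checked at all), which is exactly the guard against confirmation bias described above.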
Intuition is good at noticing confusion.
It might (for some people?) be more useful to think of the “noticing confusion” feeling as a distinct thing, rather than simply calling it “intuition”. Certainly I, for one, experience it as a specific mental sensation, so to speak.
I take it from this that you don’t have the experience of it feeling like your brain’s being hijacked into having an emotion that you don’t want?
I’m not sure… that is to say, I’m not sure what you mean, exactly, so I’ll attempt to describe something I experience, and maybe you can tell me if it’s the same as what you’re talking about.
Sometimes (though more rarely, these days) I will have a certain sort of negative emotional response, which I would describe as anxiety; it generally comes with restlessness and inability to concentrate. Naturally, this is not an emotion I ever want to be having. I have identified several specific sorts of situations that trigger this.
I don’t know that I’d describe it as feeling like my brain’s being hijacked, but only because it seems logically questionable; “hijacked” implies there’s some agent that is the hijacker. I usually have a limited ability to suppress this feeling by analyzing what I think is causing it (usually it’s worry about certain specific sorts of outcomes/events), and attempting to reason about the likelihood of such an outcome, and what I can do to prevent it. So such an emotion is annoying, but not really mysterious (except insofar as I don’t actually know all the details of my own psychology, but then, all my emotions are mysterious in that broader sense).
I don’t know that I’ve ever had any other sort of emotion that I would say I didn’t want to be having. For example, usually when I feel anger, it’s in situations when I think that it is appropriate to feel anger. In the case of frustration, though, you might be right; frustration might be one emotion that’s dispensable once it’s served its role as a signal.
Even when I’m in the right, I can fix the situation more effectively from a standpoint of not being angry. My angry self might say things that my later non-angry self would regret, and I’ve gotten pretty good at not doing that.
I have, on occasion, said things in anger that led to escalation of conflict, but I don’t think I’ve regretted saying them, since I was in the right, and felt that I was both correct and morally entitled to my comments.
When I said that we seemed to have some different values, I meant that I don’t think there’s anything inherently wrong with being angry, if the anger is justified.
However, one critical advanced skill I learned was not to trust that intuition too much. That is: a machine is brought in for repairs; symptoms are provided; I think “aha, it sounds like a motherboard problem”. Certainly, when going through diagnostic procedures, I should then be on the lookout for confirming evidence. But one of the most serious errors a technician might make in this situation is not being sufficiently thorough in checking the other possibilities for what might be wrong.
Very true. Confirmation bias and insufficiently thorough searching for a diagnosis are big issues in medicine, too. I’m not sure whether there’s a difference between health care practitioners who were originally logic-dominant thinkers and those who were originally intuition-dominant thinkers, or whether both struggle and have to learn the other skill anyway.
One difference is that in your line of work, you have time to be as slow, thoughtful, and deliberate as you want when figuring out a problem. Obviously it’s better to reason things through as well as noticing intuitions, but System 2 (roughly, explicit reasoning) is slow and effortful and puts a heavy load on working memory, while System 1 (roughly, intuition) is fast and doesn’t fill up working memory. My younger self wanted to reason through everything logically–and as a result, because nursing is a profession where you’re always working under time constraints, I was always a step behind everyone else, always took longer to get started at the beginning of the day, always stayed an hour past the end of a shift to finish charting. I don’t think this is because I’m a “slow” thinker–I finish written exams in half the time it takes most of the other nursing students. Also, in my experience, having a load on working memory increases confirmation bias–I don’t know whether this has been studied, although it wouldn’t be a hard study to do. I’m more curious about things that don’t make sense now.
And note that the bulwark against making such errors is to have a rigorous diagnostic procedure, follow it, and get into the habit of reasoning about the situation explicitly (bouncing ideas off another person helps with this). In other words: a logic-based approach.
Modern medicine makes heavy use of checklists. I think this is awesome. I don’t need any urging to use them; I was making personal checklists on my worksheet well before I knew this was already a thing. And “if in doubt, ask someone else to come have a look” is pretty universal too. Also not something I need urging to do.
I don’t know that I’d describe it as feeling like my brain’s being hijacked, but only because it seems logically questionable; “hijacked” implies there’s some agent that is the hijacker.
I don’t literally mean that. It’s just what it feels like.
I have, on occasion, said things in anger that led to escalation of conflict, but I don’t think I’ve regretted saying them, since I was in the right, and felt that I was both correct and morally entitled to my comments.
Even when this is the case, I don’t find that anger helps me get what I want. Then again, being agreeable, a lot of what I want is “not to be in conflict anymore.” Also, I think some people kind of enjoy the powerful feeling that anger gives them. Whereas I find the feeling of anger aversive.
It seems we mostly agree about the usefulness and applicability of gut feelings, as well as their limitations. (Of course, if someone else is aware of any research about their accuracy, I am still interested in seeing it.)
One way I would summarize the ideal setup is: during “downtime”, use logic-based reasoning to come up with a rigorous and easy-to-apply procedure; during “crunch time”, use intuition to generate probable avenues of investigation and likely candidates for diagnosis and solution; supplement with the pre-developed procedure to guard against biases and ensure correct usage of intuition-derived data.
Does this sound like a fair summary?