I reflexively tried to reverse the advice, and found it surprisingly hard to think of situations where applying higher-level intuition would be better.
There’s an excerpt by chess GM Mikhail Tal:
We reached a very complicated position where I was intending to sacrifice a knight. The sacrifice was not obvious; there was a large number of possible variations; but when I began to study hard and work through them, I found to my horror that nothing would come of it. Ideas piled up one after another. I would transport a subtle reply by my opponent, which worked in one case, to another situation where it would naturally prove to be quite useless. As a result my head became filled with a completely chaotic pile of all sorts of moves, and the infamous “tree of variations”, from which the chess trainers recommend that you cut off the small branches, in this case spread with unbelievable rapidity.
And then suddenly, for some reason, I remembered the classic couplet by Korney Ivanovich Chukovsky: “Oh, what a difficult job it was. To drag out of the marsh the hippopotamus”. I don’t know from what associations the hippopotamus got into the chess board, but although the spectators were convinced that I was continuing to study the position, I, despite my humanitarian education, was trying at this time to work out: just how WOULD you drag a hippopotamus out of the marsh? I remember how jacks figured in my thoughts, as well as levers, helicopters, and even a rope ladder. After a lengthy consideration I admitted defeat as an engineer, and thought spitefully to myself: “Well, just let it drown!” And suddenly the hippopotamus disappeared. Went right off the chessboard just as he had come on … of his own accord!
And straightaway the position did not appear to be so complicated. Now I somehow realized that it was not possible to calculate all the variations, and that the knight sacrifice was, by its very nature, purely intuitive. And since it promised an interesting game, I could not refrain from making it.
But this is a somewhat contrived example, since it is reminiscent of the pre-rigor, rigor, and post-rigor phases of mathematics (or, more generally, of mastering any skill). And one could argue chess GMs have so thoroughly mastered the lower levels that they can afford to skip them without making catastrophic errors.
Another example that comes to mind is Marc Andreessen in the introduction to Breaking Smart:
In 2007, right before the first iPhone launched, I asked Steve Jobs the obvious question: The design of the iPhone was based on discarding every physical interface element except for a touchscreen. Would users be willing to give up the then-dominant physical keypads for a soft keyboard?
His answer was brusque: “They’ll learn.”
It seems quite clear that Jobs wasn’t applying intuition at the lowest level here. And it seems like the end result could well have been worse if he had applied intuition at lower levels. He even explicitly says:
You can’t connect the dots looking forward; you can only connect them looking backwards. So you have to trust that the dots will somehow connect in your future. You have to trust in something—your gut, destiny, life, karma, whatever. This approach has never let me down, and it has made all the difference in my life.
I find neither of the examples I came up with convincing. But are there circumstances where applying intuition at lower levels is a strategic mistake?
Applying intuition at lower levels is a strategic mistake when you are substantially more certain that your high-level intuition is well-honed, than you are of your ability to explicitly decompose the high level into lower-level components.
(It can also be a strategic mistake for computational-cost reasons, as I outline in my other comment.)
Also, why is the Steve Jobs example unconvincing? It seems, in fact, to be an example of the sort of thing I am talking about.
Here’s something that Bruce Tognazzini (HCI expert and author of the famous Apple Human Interface Guidelines) said about Steve Jobs:
Steve Jobs was also one of the greatest human-computer interaction designers of all time, though he would have adamantly denied it. (That’s one of Apple’s problems today. They lost the only HCI designer with any power in the entire company the day Steve died, and they don’t even know it.)
Had you asked Steve Jobs to break down his intuitions into lower-level components, and then evaluate those, he may well have failed. And yet he made incredible, groundbreaking, visionary products, again and again and again. He had good reason to be confident in his high-level intuitions. Why would he want to discard those, and attempt a lower-level analysis?
I had worded it somewhat poorly; I wasn’t intending to say that Steve Jobs should have attempted a lower-level analysis in technology design.
I just found it unconvincing in the sense that I couldn’t think of an example where applying lower-level intuitions was a strategic mistake for me in particular. As you mention in your other comment, I am not substantially more certain that my high-level intuition is well-honed than I am of my ability to decompose it, in any particular discipline.
More generally, Steve Jobs consistently applied high-level intuition to big life decisions too, as evidenced by his commencement speech. On the whole it worked out for him, I guess, but he also tried to cure his cancer with alternative medicine, which he later regretted.
I completely agree with your computational tradeoff comment though.