The title is ‘A Hierarchy of Abstraction’ but the article focuses on levels of intelligence. The article claims that intelligence positively correlates with the ability to handle high level abstractions, but it does not talk about actual hierarchies of abstraction. For example, I’d expect a hierarchy of abstraction to contain things like: concrete objects, imagined concrete objects, classes of concrete objects, concrete processes, simulated processes, etc. A more accurate title might be ‘The Ability to Understand and Use Abstractions in Computer Science as a Measure of Intelligence.’
The article lays out a way of measuring fluid intelligence but does not decouple the crystallized intelligence requirements from the fluid ones. For example, ‘Understands Recursion’ requires implementing a specific algorithm recursively. There are plenty of people who understand and use recursion regularly who do not know that algorithm. (I’m one of them.) Let’s say you test them and they fail. Did they fail because of their fluid intelligence? Because of a lack of crystallized knowledge related to that specific problem? Because of the abstraction requirements of that specific problem, but not recursion in general?
What about recursion as a concept makes it hard for people to understand? I would recommend generalizing the requirements and exploring attributions of failure other than fluid intelligence. If the article examined the components of recursion it would be more interesting and compelling. What are those components?
Drilling down into the components of any of these tests will reveal a lot of context and crystallized knowledge that the article may be taking for granted (curse-of-knowledge bias). You might see someone struggle with recursion when the problem isn’t that they can’t understand recursion; it’s that they lack crystallized knowledge of a building block. As someone who understands recursion to a reasonable level, I’d like to see the article point at the key idea behind recursion that people have trouble grasping. Is there a sequence of words the article can specify where someone understands what each word means, but finds the overall sentence ineffable? Or perhaps they can parrot it back, but they can’t apply it to a novel problem. This hypothesis requires that someone has all prerequisite crystallized knowledge but still cannot solve the problem. Otherwise these are not ‘hard’ boundaries of fluid intelligence.
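To make the distinction concrete, here is a sketch of two recursion tasks with very different crystallized-knowledge requirements. These tasks are my own illustrations, not the article’s actual tests:

```python
def depth(nested):
    """Task A: maximum nesting depth of a list.

    Solvable by anyone who grasps the core idea of recursion
    ("solve a smaller copy of the same problem") with essentially
    no other domain knowledge.
    """
    if not isinstance(nested, list):
        return 0
    # default=0 handles the empty list, whose depth is just 1.
    return 1 + max((depth(item) for item in nested), default=0)


def merge_sort(xs):
    """Task B: a specific named algorithm.

    Failing this may only mean the person never learned merge sort
    (crystallized knowledge), not that they cannot think recursively.
    """
    if len(xs) <= 1:
        return xs
    mid = len(xs) // 2
    left, right = merge_sort(xs[:mid]), merge_sort(xs[mid:])
    merged = []
    while left and right:
        merged.append(left.pop(0) if left[0] <= right[0] else right.pop(0))
    return merged + left + right
```

A test built from Task B conflates the two kinds of intelligence; a test built from Task A comes closer to isolating the fluid component.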
I guess you primarily deal with computers and programming. One way to try and generalize this quickly would be to compare notes across disciplines and identify the pattern. Is there a ‘cannot learn pointers’ in chemistry for example?
I understand that you are trying to share the gist of an idea, but I think these are things that should be further examined if you want other people to take on this mental model. Much more needs to be said and examined in an article that lays out 10 specific levels with specific tests.
I’d also be wary of the possibility this entire framework / system looks good because it positions your tribe as superior (computer programmers) and possibly you somewhere comfortably towards the top.
This article triggered me emotionally because I think one of the things that prevents people from learning is the belief that they can’t. I wouldn’t want anyone to take away from this article that because they didn’t understand pointers or recursion at some point in their lives, they are dumb and should stop trying.
As I pointed out in my other comments, these levels are about potential, not ability. The questions are not really “Can you answer x?” They are more along the lines of “Can you easily learn to answer x?” I believe this decouples “the crystallized intelligence requirements from the fluid ones”.
What about recursion as a concept makes it hard for people to understand?…I’d like to see the article point at the key idea behind recursion that people have trouble grasping.
Recursion requires a person to hold two layers of abstraction in zeir mind simultaneously. This could require twice as much working memory. Working memory, unlike crystallized intelligence, is something we cannot do much to improve.
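A minimal sketch of what the two layers look like in practice (my own illustration, using a recursive sum):

```python
def total(xs):
    # Layer 1 (concrete): the work done at *this* level --
    # peel off the first element.
    if not xs:
        return 0
    head, rest = xs[0], xs[1:]
    # Layer 2 (abstract): trust that total(rest) already yields the
    # sum of the remainder, even though that computation has not
    # happened yet. Holding both layers at once is the claimed
    # working-memory load.
    return head + total(rest)
```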
Is there a ‘cannot learn pointers’ in chemistry for example?
I have never tutored chemistry. But I have tutored physics for years. I have never run into anything like “cannot learn pointers” in standard undergraduate physics. The only time I encounter anything like “cannot learn pointers” is when I discuss my weird crackpot theories of relational quantum gravity and entropic time.
The ideas involved in undergraduate computer science are simpler than the ideas involved in undergraduate physics. The difference between undergraduate computer science and undergraduate physics is that undergraduate physics is a set of solutions to a set of problems. It requires little creativity. You can (relatively speaking) learn everything by rote. While fluid intelligence helps you learn rote knowledge faster, fluid intelligence tends not to put a hard cap on total rote knowledge acquisition.
On the other hand, computer science requires a person to solve novel (albeit simpler) problems all the time. It makes sense one’s ability to do this could be limited by fluid intelligence.
The other difference between computer science and physics is that in physics you never have to hold two layers of abstraction in your head at the same time. It makes sense that holding two layers of abstraction in your head at the same time could be limited by working memory. One’s ability to solve novel problems is the definition of fluid intelligence. Fluid intelligence is correlated with working memory.
My brief forays into chemistry suggest chemistry involves more rote knowledge and less “juggling multiple layers of abstraction” than computer science does. Chemistry, like all subjects, is limited by intelligence at some level. But I would expect the floor is lower than physics and computer science.
I’d also be wary of the possibility this entire framework / system looks good because it positions your tribe as superior (computer programmers) and possibly you somewhere comfortably towards the top.
Really? I do not know a single computer programmer who meets #9. I know only one who passes #8. Most of my programmer friends do not even pass #7. This framework puts quants on top. Not computer programmers. I do not think anyone would doubt that quants are smart people—at least in the sense relevant to this post.
Computer programmers tend to be superior at computer programming (and perhaps related fields).
This article triggered me emotionally because I think one of the things that prevents people from learning is the belief that they can’t. I wouldn’t want anyone to take away from this article that because they didn’t understand pointers or recursion at some point in their lives, they are dumb and should stop trying.
Among the most important things I have learned from Less Wrong is the value of epistemic rationality over instrumental rationality. It breaks my heart that certain people seem unable to perform certain tasks I consider trivial. The greater cruelty is blaming someone for their failure to achieve a goal zey lack the potential to accomplish.