Plausible, and very important if so. Why do you expect this? What evidence weighs for and against?
I don’t have good evidence. Note that the space of all possible problems is very large; most problems are ones that either all humans could solve trivially, or all humans would fail to solve. You aren’t necessarily going to get a nice clean “window” into the space of all possible problems such that every problem with c < difficulty(problem) < d falls inside your window; you might instead have P(problem is in your view) ~ 1 / difficulty(problem). Suppose that we then define problem difficulty in terms of algorithm runtime or minimal program length, and define educational level as proportional to the difficulty of problems solvable by a person at that educational level. Suppose that the number of problems in problem-space that someone with education edu can solve is nps(edu) = edu^2. The number of viewable problems that person can solve is then only npv(edu) ~ edu^2 / difficulty ~ edu, which would appear to us to grow only linearly with education.
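A minimal sketch of that toy model (the quadratic solvable set, the 1/difficulty visibility, and the number of problems per difficulty level are all assumptions carried over from the paragraph above, not data):

```python
# Toy model: the set of problems someone can solve grows ~quadratically with
# education, but the chance we ever *see* a problem falls off as 1/difficulty,
# so the solved problems we actually observe grow only ~linearly.

def observed_solved(edu):
    """Expected number of visible problems solvable at education level `edu`.

    Assumptions: a person with education `edu` solves any problem of
    difficulty <= edu; there are ~d problems at each difficulty level d
    (so nps(edu) ~ edu^2); a problem of difficulty d is in view with
    probability ~ 1/d.
    """
    total = 0.0
    for d in range(1, edu + 1):
        n_problems_at_d = d        # assumed count of problems at difficulty d
        p_visible = 1.0 / d        # visibility falls off with difficulty
        total += n_problems_at_d * p_visible
    return total

for edu in (10, 100, 1000):
    print(edu, observed_solved(edu))   # linear in edu, even though nps(edu) ~ edu^2
```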
So the answer probably depends on what subset of problems we face. I believe that we continually make our society as complex as we can (to improve efficiency) while maintaining specialists in every necessary area who can deal with most of the problems arising in that area.
So it might be that, within an area of expertise (say, metallurgy), you’d find that most competent metallurgists can solve 90% of the set of problems they consider. There aren’t enough unsolvable problems in the space to detect an exponential increase. But most non-metallurgists might be able to solve 2% of them. (Totally made-up figures.)
If we suppose that politics is an area in which practitioners are chosen for their ability to get elected rather than for expertise in problem-solving, and that the subset of problems under consideration is set by the same kind of process as in metallurgy, then we might expect that 90% of political problems would be solvable by someone with the right education, but that only 5% can be solved by the typical politician. If we further suppose that the 2%-solving politician is only 1 standard deviation below the 5%-solving politician, then, under the theory that the number of problems solved increases linearly (or slower) with ability/education/etc., the atypical 90%-solving politician would have to be so many standard deviations above the 5%-solving politician that none would exist. So, by contradiction, the relationship must be more than linear.
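A back-of-the-envelope check on that step (the 2%, 5%, and 90% figures are the made-up numbers above; the exponential comparison at the end is my own addition for contrast, not something the argument claims):

```python
import math

weak, typical, expert = 2.0, 5.0, 90.0   # % of problems solved (made-up figures)

# Linear model: 1 SD of ability adds (typical - weak) percentage points.
sd_gap_linear = (expert - typical) / (typical - weak)
print(f"linear: expert is {sd_gap_linear:.1f} SDs above typical")    # ~28 SDs -> nobody exists

# Exponential model: 1 SD multiplies the fraction solved by (typical / weak).
sd_gap_exp = math.log(expert / typical) / math.log(typical / weak)
print(f"exponential: expert is {sd_gap_exp:.1f} SDs above typical")  # ~3 SDs -> rare but real
```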
A weakness with this argument is that I just guessed all the numbers right now.
I think the intuition behind my saying this was that the number of possible programs you can run increases exponentially in the size of your Turing machine’s tape. Size of your tape ~ your education.
Another approach is to define the difficulty of a problem in terms of the length or runtime of a program that can solve it. You then find that n(diff), the number of problems that exist at a given difficulty level, is exponential in diff.
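A minimal illustration of the counting behind that claim, assuming a binary program alphabet (a larger alphabet only changes the base of the exponential):

```python
# There are 2**L distinct bit strings of length L, so if a problem's
# difficulty is the length of the shortest program that solves it, the
# number of candidate programs -- and hence of problems that can sit at
# difficulty level L -- grows exponentially in L.
for length in range(1, 11):
    print(length, 2 ** length)
```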
Have there been studies of how worker productivity is distributed? We should at least be able to get economy-wide income data, which gives us nearly the same info if we assume people’s pay tracks the value they add.
In math, it does feel as though one’s mathematical power is something like exponential in the amount of time one has spent on solid math study, at least until one hits the frontiers of one’s subfield. Algebra I takes a year to learn, but a few years later, the content of algebra I seems similar to the content of a section (not even a chapter) within a semester’s course. I suspect similar increases in my ability to learn other competencies as I “learn how to learn” those fields, but it is harder to quantify. Do any of you programmers, or others, care to estimate how your productivity has changed with focused efforts to learn, or to learn how to learn?
Math feeds on itself. It takes a simple concept and then examines it from a dozen points of view, seeing more and more structure in the idealized problems, on both formal and intuitive levels. As a result, having learned some math, you can more easily anticipate new structure in new math and in other problems; on the intuitive level it generalizes very well, because of the simplicity of the constructions that get studied.
Educators must know a great deal about the effect of mathematical sophistication. It’s interesting how undergraduate textbooks, even on subjects I know nothing about, seem boring and long-winded, and how it’s often more instructive to just find a tutorial paper, tapping greater depth through its keywords and references.
the content of algebra I seems similar to the content of a section

That’s probably because the real content, e.g. the idea of a variable, is invisible to you now. To someone who already knows it, it can’t be drawn out any longer than a section. There is a phrase, “mathematical sophistication”, for such content that’s hard to pin down. Also, people may teach it inefficiently, because they don’t remember what the gap is.
I think a high-school Algebra I course takes a year because it is designed for students who are not interested in math. The Algebra I students who will go on to take higher-level college courses could likely assimilate the early material much faster if it were expected of them. That advanced classes proceed at a faster pace could simply reflect that the earlier classes have weeded out the students without the ability or interest to keep up.
I loved math, and am talented at it, and it still took me a year. It was just a year at a much younger age.
Did it take a year because it really took that long to understand the material, or because the class took a year to present it to you?
Of course, age is also a factor: an adult can concentrate on a subject for longer than a child can. This might be better illuminated by changes in the rate of self-directed learning.
It really took me roughly that long, although it was more conceptually deep than most algebra courses. I learned most of my math at my own pace, with help from my dad. My non-confident guess is that most mathematically talented people encounter algebra and other subjects long after they’re ready for them, and therefore learn them fairly rapidly but at the cost of having wasted time earlier on. But I may just have been slow.
In any case, even restricting to bright college students other than me, I’ve watched multiple individuals get much faster at learning math over the course of undergrad.
Just how old were you when you studied it?
Um, well, I was simplifying. Algebra 1 I learned between 2nd and 5th grade, mostly incidentally but not especially quickly, in the course of asking my dad about probability, basic number theory (rational and irrational numbers; modular arithmetic and divisibility facts; etc.), the limit of 1⁄2 + 1⁄4 + 1⁄8 + …, and other topics of interest. (Algebra 1 was harder for me than, say, Bayes’ theorem, which I don’t think is the case for most high schoolers. It’s as though some cognitive skills were online and others weren’t; in particular, the formal/schematic ones weren’t.)
Algebra 2 I learned in sixth grade, in a normal course (for 8th graders in the gifted program). It wasn’t too slow, though, or not by much. I came in with less than a full Algebra 1’s worth of background, struggled a bit the first semester, and did fine the second. Geometry I learned in 7th grade, working from a book (they let me do my own thing in math class) with help from my dad, and doing more exploration and proofs than the book included. I spent maybe two-thirds of the year on it, then did some trig and basic discrete math.
Which suggests a fairly normal rate of learning, though with deeper exploration and at a younger age. I would non-confidently guess that many of those who go on to study math would have been similar as kids, if given the opportunity. Kids have less ability to hold formal scaffolds in their heads, and, as Douglas Knight notes, it’s hard as adults to see how large the cognitive distance is.
My non-confident guess is that most mathematically talented people encounter algebra and other subjects long after they’re ready for them, and therefore learn them fairly rapidly but at the cost of having wasted time earlier on.

That might explain my experience tutoring my cousin in math. I find he is able to catch up quickly once I explain the background material a given concept is based on. So, if he had been ready for some time to learn the background material, then learning it when I present it is not a big deal and doesn’t even noticeably detract from the effort and focus he needs to understand the new concept he is supposed to be learning.
Pay tracking value added is an extremely unlikely proposition. There are too many confounding factors that would swamp the effect while being very difficult to control for.
At any rate, it’s considered common knowledge that great programmers are an order of magnitude more productive than average programmers, and that truly bad programmers can achieve net-negative productivity.
To what extent that has solid supporting evidence vs. a lot of anecdotes, I don’t know.
I’m puzzled by a repeated Internet startup pattern:
One or two founders build an application or a website.
Website/app catches on. VCs invest money.
Company grows to employ dozens of people, without much improvement in the product.
You could read this as meaning that the founders were great programmers, who then hired average programmers. Or it could mean that the product is only a small fraction of the value of a company, and the other people do graphic design, public relations, marketing, advertising, business deals, accounting, and managing.
(It’s surprising that companies that developed much of their software product with only a few people frequently go out of business with dozens of people on the payroll, when you’d think they could just fire those people to become profitable. Do VCs make companies grow too fast?)
A bit of both, I expect. However, you neglect the legacy-code effect: two great coders can write a new system that basically works amazingly fast, while a larger team doing maintenance and enhancements on an existing code base takes a lot longer.
And yes, they probably make them grow too fast. The anecdotes I’ve read about life inside a startup suggest that most VCs are actively harmful to companies at every point except while signing checks. However, it may very well be a rational strategy for the VCs, because a lot of dead startups and one huge success makes them more money than a handful of moderately successful companies and no smash-hits.
I’ve seen this pattern with growing teams within large companies. I believe there is some research on the phenomenon in the software engineering / project management literature suggesting that the root cause is the rapid decrease in communication efficiency as team size grows beyond a fairly small number of individuals. Companies or teams that grow slowly can sometimes adopt new methods to coordinate larger and larger groups efficiently, so that total productivity continues to increase even as average productivity per individual declines; but an all-too-common failure pattern is that total productivity actually declines as the team or company grows.
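A toy illustration of why coordination overhead can outgrow headcount (the pairwise-channel model is the classic Brooks-style argument, used here only for illustration; the numbers are not from any study):

```python
def channels(n):
    """Pairwise communication channels in a team of n people."""
    return n * (n - 1) // 2

for n in (2, 5, 10, 20, 50):
    print(f"{n:3d} people -> {channels(n):5d} channels "
          f"({channels(n) / n:.1f} per person)")
# Channels per person grow with n, so if each channel costs a roughly fixed
# slice of attention, per-person output falls as the team grows -- and past
# some point total output can fall too, the failure mode described above.
```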