Human performance, psychometry, and baseball statistics
I. Performance levels and age
Human ambition for achievement, in modest measure, gives meaning to our lives, unless one is an existentialist pessimist like Schopenhauer, who taught that life with all its suffering and cruelty simply should not be. Psychologists study our achievements under a number of different descriptions—testing for IQ, motivation, creativity, and other traits. As part of my current career transition, I have been examining my own goals closely, and I have recently read a fair amount on these topics, whose evidence varies considerably in quality.
A useful collection of numerical data on the subject of human performance is the archive of Major League Baseball player statistics—the batting averages, home run totals, runs batted in, and slugging percentages of the many thousands of participants over the hundred-plus years that detailed records have been kept and studied by players, journalists, and fans of the sport. The advantage of examining these issues through baseball statistics is the enormous sample size of accurately measured and archived data.
The current senior authority in this field is Bill James, who now works for the Boston Red Sox; for the first twenty-five years of his career as a baseball statistician, James was not employed by any team. It took him a long time to find a hearing inside the industry, although fans started buying his books as soon as he began writing them.
In one of the early editions of his Baseball Abstract, James discussed the biggest fallacies that managers and executives held regarding the achievements of baseball players. He was adamant about the most commonly misunderstood fact of player performance: it peaks sharply at age 27 and then declines rapidly, so rapidly that only the very best players are still useful at age 35. He observed only one executive who seemed to intuit this—a man whose sole management strategy was to trade everybody over the age of 30 for the best available player under 30 he could acquire.
There is a fair amount of more formal academic research on this issue, described in the literature as the field of Age and Achievement. The dean of the psychologists studying Age and Achievement is Dean Simonton. A decent overview of their findings is here. It is a meta-study of hundreds of individual studies, sampling many fields and many metrics. There is one repeated finding. Performance starts low at a young age, steadily increases along a curve that resembles a Gaussian bell-shaped curve, peaks, and then declines. The decline is not as rapid as the rise (the shape is not symmetric; it climbs steeply from the left to the peak and descends gently from the peak to the right), but it is seen everywhere. The age of peak achievement varies with the field. Baseball players peak at 27 (the curves in the psychology publications look exactly like the curve Bill James published in his Abstract), business executives peak at 60, and physicists peak at 35. Shakespearean actors peak late and rock stars peak early. These are statistical results, and individual outliers abound. You, the individual physicist, may not be over the hill at 40, but this is the way to bet.
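To make the shape concrete, here is a minimal sketch of such a skewed curve (a toy model with made-up widths, peaking at 27; it is not fitted to Simonton's data or to James's):

```python
import numpy as np

def relative_performance(age, peak_age=27.0, rise_width=4.0, decline_width=9.0):
    """Toy age-performance curve: Gaussian-shaped on each side of the peak,
    but wider (i.e., gentler) on the decline side than on the rise side."""
    age = np.asarray(age, dtype=float)
    width = np.where(age < peak_age, rise_width, decline_width)
    return np.exp(-((age - peak_age) ** 2) / (2.0 * width ** 2))

ages = np.arange(18, 41)
for a, p in zip(ages, relative_performance(ages)):
    print(f"age {a}: relative performance {p:.2f}")
```

The steep left side and gentle right side are the only features the sketch is meant to capture; the actual peak age and widths differ from field to field.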
My hometown major league franchise, the Houston Astros, recently had this empirical law verified for themselves in real time, and the hard way. They invested the bulk of their payroll budget in three players: Miguel Tejada, Carlos Lee, and Lance Berkman. All three were over the age of 30, i.e., definitely into their decline phase. When their performance declined more rapidly than predicted, the team lost many more games than it had planned for. They had a contending team’s payroll and big plans, but now Tejada and Berkman are gone and they are rebuilding. In an attempt to cut losses, they traded their (prime-age) star pitcher for young players.
A recent post on Hacker News, Silicon Valley’s Dark Secret: It’s all about Age, generated 120 comments of heated discussion about institutional age discrimination and the unappreciated value of experience. The consensus view expressed there is that programmers must either advance into management or become unemployable somewhere around age 50.
It could perhaps be seen as an example of evolutionary biology at work. We are in an ecosystem, and the ecosystem selects for fitness. What is sometimes misunderstood is that the ecosystem does not select for absolute fitness, but for fitness specific to a niche. If the available niches in this “ecosystem” are for 40-year-old brains, and there aren’t any niches for 50-year-old brains, then some fully fit brains (in an absolute sense) are going to be out of employment opportunities. Faced with a system like this, the job seeker may have to be clever at finding ever narrower niches to squeeze into.
One of the moderators at Hacker News, Paul Graham, is a software startup venture capitalist. In the thread he is accused of unconcealed age discrimination—of refusing to invest in entrepreneurs over 38, and of claiming that nobody over 25 will ever learn Lisp. If you are a forty-year-old physicist who wants to learn Lisp and get venture capital funding for your business plan—well, good luck with that!
II. Time to mastery
This leads directly into my second topic within my larger subject of human performance, psychometry, and baseball statistics: learning curves and the estimated time to mastery. To continue with the above example, assuming you want to master Lisp, how much of your time should you plan to allocate for the task? K. Anders Ericsson is the author of the relevant research findings. At a crude level of approximation, something like that takes ten thousand hours. I was first exposed to this result many years ago in the context of Buddhist meditation, at an Esalen conference presented by Helen Palmer (mostly known for her work on the Enneagram). She reported that becoming skilled at Zen meditation requires ten thousand hours of practice. In the University of Wisconsin brain-imaging meditation study, the subjects were Tibetan monks who had all logged a minimum of ten thousand hours of practice. The ten-thousand-hour requirement was also reported popularly by Malcolm Gladwell in his best-selling book Outliers. Another take on this: Teach Yourself Programming in Ten Years. Ten thousand hours of 40-hour weeks is five years, not ten; the number is not precise, but the idea is consistent: ambitious projects take a daunting amount of time.
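Spelling out the arithmetic behind that five-year figure (assuming roughly 50 working weeks per year, which is my assumption, not Ericsson's):

```python
hours_needed = 10_000
hours_per_week = 40
weeks_per_year = 50   # assumption: roughly 50 working weeks per year

weeks = hours_needed / hours_per_week    # 250 weeks of full-time practice
years = weeks / weeks_per_year           # 5.0 years
print(f"{weeks:.0f} weeks, i.e. about {years:.0f} years of 40-hour weeks")
```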
One of my dance teachers was fond of reminding me that practice does not make perfect; only perfect practice makes perfect. For most of us even that is an exaggeration. I think we can reliably predict that ten thousand hours of very good practice will make you very good, provided you start with an average or above-average amount of raw aptitude.
III. Distribution of performance across a population, replacement-level player
The second biggest fallacy among baseball personnel managers, according to Bill James, is that they do not understand how ability is distributed among professional baseball players. He defines the concept of the replacement-level player, and insists that the vast majority of the fellows working in the Major Leagues are easily and quickly replaceable. His reasoning is simple.
If you take a random selection of humans and measure nearly any measurable trait—height, weight, speed, strength, reflex time—the frequency plot will be the familiar bell-shaped Gaussian curve. People playing baseball professionally are an extremely non-random sample. The left-hand 98% of the curve is gone, because none of those people have the physical requirements to get employment playing baseball. The resulting distribution is a truncated Gaussian, with a few at the highest levels and the vast majority of participants of nearly indistinguishable quality. When the talent pool is skimmed at stage after stage after stage, little league to high school to college to the minor leagues to the majors, almost all the remaining players are excellent and interchangeable.
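Here is a small simulation of that truncation (illustrative only; the 2% cutoff follows the figure above, and the "talent" scale is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(seed=0)
population = rng.normal(loc=100.0, scale=15.0, size=1_000_000)  # arbitrary talent scale

# Discard the left-hand 98% of the bell curve; keep only the top 2%.
cutoff = np.quantile(population, 0.98)
pros = population[population >= cutoff]

print(f"cutoff talent score:     {cutoff:.1f}")
print(f"median pro:              {np.median(pros):.1f}")
print(f"90th-percentile pro:     {np.quantile(pros, 0.90):.1f}")
print(f"population spread (std): {population.std():.1f}")
print(f"spread among pros (std): {pros.std():.1f}")  # much narrower than the population
```

The survivors cluster within a few points of one another; that narrow spread is the "nearly indistinguishable quality" James is describing.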
If you are managing a corporation and you only hire candidates with golden resumes, you have a truncated Gaussian distribution of talent. If your evaluation process then forces those people back onto a full Gaussian distribution, Bill James says you are doing it totally wrong. Another common mistake is that managers think there is something magical about “major league” talent, that some guys have it (what Tom Wolfe called “the right stuff”) and some do not, and they mislabel players who could help them win baseball games as not having it, based on the circumstantial variations of where those players have happened to be employed up until now. Organizations that hire top talent and pay high salaries have far more options than they generally presume. Nearly every single person working for your company is easily replaceable.
There is a story, possibly apocryphal, about Benoit Mandelbrot and his early preoccupation with financial market data. A questioner thought finance was a fuzzy science and that hard scientific data really ought to be much more attractive to his scientific temperament. Mandelbrot explained that the great feature of studying financial data was that there was so much of it, and it was thus endlessly fascinating. Many statisticians have a similar fondness for baseball statistics: it is reliably recorded, unambiguous in definition, and there is so much of it. Many subtle statistical results are best explained in the context of baseball statistics, and there may be unknown statistical theorems sitting in the archives waiting to be extracted by clever statisticians. The Wikipedia page on Stein’s paradox (first published by Charles Stein in 1956) has a reference to a well-known (well known to baseball statisticians, anyway) article from the May 1977 issue of Scientific American that uses baseball statistics to illustrate Stein’s paradox.
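For anyone curious what that illustration looks like in practice, here is a minimal sketch of James-Stein shrinkage applied to early-season batting averages (the averages below are invented for illustration, and the arcsine variance-stabilizing transform used in the Scientific American article is omitted to keep the sketch short):

```python
import numpy as np

# Hypothetical batting averages after each player's first 45 at-bats.
early_avgs = np.array([0.400, 0.367, 0.333, 0.311, 0.289, 0.267, 0.233, 0.200])
at_bats = 45
k = len(early_avgs)

grand_mean = early_avgs.mean()
# Approximate sampling variance of an average over 45 at-bats (binomial, using the grand mean).
sigma2 = grand_mean * (1.0 - grand_mean) / at_bats

# James-Stein shrinkage factor: pull each raw average toward the grand mean.
shrink = 1.0 - (k - 3) * sigma2 / np.sum((early_avgs - grand_mean) ** 2)
shrink = max(0.0, shrink)  # positive-part estimator

js_estimates = grand_mean + shrink * (early_avgs - grand_mean)
for raw, js in zip(early_avgs, js_estimates):
    print(f"raw {raw:.3f} -> shrunken estimate {js:.3f}")
```

With only 45 at-bats of evidence per player the shrinkage is aggressive, pulling a .400 start most of the way back toward the group mean; the content of the paradox is that, for three or more means, these shrunken estimates have lower total expected error than the raw averages.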
After my article was nearly finished, I stumbled upon this “news” in the New York Times Sports section:
Sniffing .300, Hitters Hunker Down on Last Chances. (Here they are presenting research from a couple of economists from U. Pennsylvania’s Wharton School of Business. The academic publication is here.)
The preceding should interest anybody curious about human achievement, psychometry, and baseball statistics. My own interest is narrower, and the lesson I personally draw is a hybrid of the lessons above. I have an ambitious scope for the company I am building. Ten thousand hours is close to the limit I am choosing for myself as the point when I will write off these lessons and losses (if losses they be) and go back to rejoin the American corporate employment market.
What’s with the 10000 hours figure?
Some things are shallower than that, right? Even if the peak is always around 10000 hours of smart practice in, if you can get to 99.9% effectiveness after 50 hours, then why continue?
I also doubt that the peak is really so close to 10000 hours no matter what.
Can somebody summarize the best evidence? It’s even interesting to me if the 10k hours (or whatever, it’s fine if it varies with the discipline) can be spread out or compressed within a wide range of real time with little difference in the outcome.
I suspect it’s a selection effect. If a task can be effectively mastered with a short investment of time (like your example of 50 hours), then it’s not something you can turn into a career. If a task can’t be mastered in less than some large upper bound of hours (say, around 20,000) then it also can’t be turned into a career. Tasks with a mastering time of around 10,000 are the ones that are pragmatic to specialize in and establish comparative advantage, so they are the ones that abound.
I think you’re on the right track, thinking of selection effects. I don’t think the effect you mentioned explains why to expect something requiring very nearly 10k hours, even though there’s some obvious truth to it. I take it you’d put a wide deviation around the (central?) 10k figure.
Careers and skills are rewarded precisely because they’re useful. I think what you’re saying is that prestigious careers are the ones that take many years to develop (5 years full time = 10k hours). I think that’s mostly true if you amend it to include those careers that are more a combination of raw physical/mental gifts and training. It’s also true that if something isn’t sufficiently rewarding, few people will bother spending 10k hours on it.
Here’s the selection effect I like: people who can never become masters (even if they try) drop out well before completing 10k hours of focused practice. If you look at the few failed masters who persist in spending 10k hours, it’s easy to find things about their practice that you can call crazy: “can’t you see that you’re doing this part wrong? why aren’t you working on correcting it, instead spending time on what’s fun or satisfying or already routine?” In fact they may never be able to correct that thing they lack the gifts for.
I think that’s nearly the entire story. Surely plenty of people with high talent nearly-peak (above 99% of their lifetime peak) sooner or (slow learners, or unusually deep potential) later than 10k hours.
I’d like to see what happens with people who are recreational athletes (basketball, volleyball, softball, etc.), not especially gifted in physical attributes, some of whom definitely log the 10k hours. Will it be accurate to say that those who actually drilled and rationally worked on specific skills based on analysis of their game performance, should be masters (limited from being pro-level only by their physical handicap)?
I would. We can see the 10k figure warping and changing over time just for highly specific expertises, like for chess—it’s a regular bit of media coverage how grandmasters seem to be getting younger and (real time) less experienced, but still as good as ever.
(It’s also interesting to note that at the same time as human grandmaster play becomes easier to acquire, truly high-level play has ceased to be just humans and become teams of computers & humans.)
I know of nothing to the contrary. The Cambridge Handbook of Expertise and Expert Performance mentions that experts are made only by deliberate practice, and that mere experience does not suffice; amateurs can spend multiple decades at something and, because their experience is not the right sort of experience, remain at their plateau.
I’d like to amend my comment; I’ve read further in the Cambridge Handbook and found this quote:
Yes, I would. I’m not deeply invested in the correctness of my exact explanation; I’d find similar explanations just as plausible. It’s just the idea that came to mind when I actually queried my brain for the answer to your original question.
Any theories about why there aren’t more 5,000 hour skills?
Well, my claim that tasks that take 10,000 hours to master are most common was pulled from the general vicinity of my posterior. If I were going to devote more than 4 minutes of thought to the topic I’d want to be a lot more careful about the whole question.
It may also be a point around which diminishing returns become excessive due to the approximate limitations of humans.
One should distinguish between necessary declines and usual declines.
Baseball players decline because their bodies wear out. This is mainly age, and there is not much you can do about it. To some extent it is playing time, especially for pitchers (and in other sports, like American football). Maybe lifetime productivity could be improved by better allocation of playing time. With physicists it might be similar: their brains wear out. Certainly that is a factor. But are the effects that strong? I don’t think they explain the distribution of work by older people.
An alternative theory to explain the physicists is that people get stuck in a rut, unwilling to learn new things, which may well be susceptible to outside intervention. This brings us to your other point about time to mastery. I’m a fan of Steve Yegge (older), who chronicles his discovery, 10 or 20 years into his programming career, that there was a lot more for him to learn, and how he went about learning it.
Can you link to some of the relevant articles? Right now the link points to some kind of parody making fun of Java open-source politics. Not especially inspiring.
I agree, I don’t think the baseball analogy is useful when thinking about fields like computer programming without more careful consideration of the differences
No they don’t. The brain atrophies when it is underused, but it does not wear out. At all. Cognitive ability, and in particular the ability to learn new stuff or adjust to new things declines with age (or “maturity”), but high mental activity protects you against this, so that your “peak” becomes longer and your mental decline slower.
I wonder how the variance in speed for developing a high level of skill in something can be affected by having previously acquired relevant skills.
It was certainly much easier for me to learn to play the piano to a high level of proficiency after I had learned to play the guitar than it was to learn the guitar in the first place. I had already developed above-average dexterity, and I had a conceptual understanding of music that I could apply to the piano. Both of these skills clearly gave me an advantage when approaching the piano.
Is there a good way to identify which skills, once mastered, can most readily be developed into a broad spectrum of abilities? Has there been any attempt to map the ‘skill hierarchy’?
Interesting subject, thanks! However, I found it a bit hard to distill the overall message you’d like to convey, i.e., the conclusion. Maybe there should be a final paragraph to bind the threads together?
I am not a psychologist, so I am not conveying a message here beyond this: I did a bunch of reading in this area earlier this year, these three items stand out to me as significant and not generally appreciated (I did not appreciate them very much a year ago), and people here might find them interesting.
This is very old, but if I am eyeballing the timeline correctly we should be approaching the point where you are deciding whether to cut your losses or endorse the lessons. So if I may, how did it go?
Follow-up
This did not go too badly. I had this posting 90% written when the Discussion section started, so I had already decided to plunge in with a top-level post. Next time I will post into Discussion first, and if there are only fourteen comments I will leave it there.
Most of the comments are about the 10,000-hours topic, and I have given it a little more thought and have a line of possible further inquiry. This is the amount of time it takes to establish the habitual brain-cell connections of a programmer, a doctor, a violinist. It takes more than an afternoon to train a dog, and it may take more time to train a wild horse than the result is worth. These are much simpler systems than the human brain.
What happens over the training or learning interval is that a large, complex set of nerve-cell connections is established and stabilized through repetition. There was a “most popular” article in the New York Times a few weeks ago reporting that the latest research findings indicate we should study in a variety of environments to expedite learning.
I have found the techniques in this book useful:
Study is Hard Work by William H. Armstrong.
I just want to say thanks for writing this. The replacement-level player in particular made it immediately obvious, once I examined my prejudice, that the average drop in quality as you go down a tier among graduates of highly ranked graduate programmes will be very, very small, almost swamped by noise.
If something apart from the Cambridge Handbook of Expert Performance was particularly helpful for figuring out when people peaked or if peaking time was a function of invested time or of age I’d be very interested.
K. Anders Ericsson is one of the editors of that Cambridge Handbook, and he is also the author of the study linked above. I have not read the handbook yet, but as near as I can tell he is the world’s leading authority on this question, and I found that linked paper easy to grok. The quickest way into the literature is probably to follow his citations.
I have not done this yet either. If you do it and find something useful I would like to know about it. I am going through a mid-life career transition and wary of biting off more than I can chew up and swallow.
I’ve also read it several times before that physicists and scientists tend to achieve their best results by their mid-thirties. But I don’t think the characterization necessarily works for physics/math/etc. like it does for baseball and athletics. There’s just a major qualitative difference there—e.g., athletes are forced to retire fairly young, whereas teachers are very rarely forced to retire until they are really nearing the end of their viable lifespan. Although I do agree that in something like physics, there is also a component of “mental athleticism”, which just naturally peaks at a medium or youthful age.
Also, for a lot of subjects like physics or math, you probably won’t be able to have a decent mastery of your work until around, say, age 25-35. So the simple fact of the matter is that you will always be past your peak for the majority of your practicing career. It’s a bit sad, but again, I think it just shows that the concept of “peaking” may not really be as broadly applicable to academic areas.