Very late reply, reading this for the 2022 year in review.
As one example: Y Combinator companies show a roughly linear correlation between exit value and number of employees, and basically all companies with $100MM+ exits have >100 employees. My impression is that there are very few companies with even $1MM revenue/employee (though I don’t have a data set easily available).
So there are at least two different models which both yield this observation.
The first is that there are few people who can reliably create $1MM / year of value for their company, and so companies that want to grow their revenue have no choice but to hire more people.
The second is that it is entirely possible for a small team of people to generate a money fountain which generates billions of dollars in net revenue. However, once you have such a money fountain, you can get even more money out of it by hiring more people, comparative advantage style (e.g. people to handle mandatory but low-required-skill jobs to give the money-fountain-builders more time to do their thing). At equilibrium, companies will hire employees until the marginal increase in profit is equal to the marginal cost of the employee.
My crackpot quantitative model is that the speed with which a team can create value in a single domain scales with approximately the square root of the number of people on the team (i.e. a team of 100 will create 10x as much value as a single person). The sample size is low, but this has held in the handful of (mostly programming) projects I’ve been a part of as team size fluctuated, at least for n between 1 and 100 on each project (including one project that started with 1 person, grew to ~60, then dropped back down to 5).
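A minimal sketch of how this square-root model interacts with the hiring-equilibrium condition mentioned above (hire until marginal profit equals marginal cost). The revenue multiplier and employee cost are made-up numbers purely for illustration:

```python
def revenue(n: int) -> float:
    """Hypothetical annual revenue under the sqrt model: value ~ sqrt(headcount).

    The $2MM multiplier is an arbitrary illustrative constant.
    """
    return 2_000_000 * n ** 0.5


COST_PER_EMPLOYEE = 150_000  # assumed fully-loaded annual cost, also made up


def optimal_headcount(max_n: int = 10_000) -> int:
    """Hire while the marginal employee adds more revenue than they cost."""
    n = 1
    while n < max_n and revenue(n + 1) - revenue(n) > COST_PER_EMPLOYEE:
        n += 1
    return n
```

Under the sqrt model, `revenue(100)` is exactly 10x `revenue(1)`, matching the "team of 100 creates 10x the value" claim, and the equilibrium headcount is finite because marginal revenue falls like 1/sqrt(n) while marginal cost stays flat.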
Sure, I think everyone agrees that marginal returns to labor diminish with the number of employees. John’s claim, though, was that marginal returns are non-positive, and that seems empirically false.