Also, I don’t have any reason to accept “bankunderground” as a credible and accurate measurement of progress, especially since I can just “look out the window” and see that, somehow, all of China’s absurd growth in per-citizen productivity just... doesn’t show up in the data. Huh.
The data pertains to Britain, not the developing world, and it comes from the Bank of England. Obviously, tremendous economic progress has been made globally since the 70s, but that progress is mostly “catching up”, i.e. developing countries adopting existing technologies. Far less development has happened at the frontier.
Oh. Well, focusing on just Britain is meaningless. Why not focus on Cuba, or on one household down your block? The point is that a “small” country can easily muddle along for any number of reasons while absurd runaway progress happens elsewhere.
The same trend can be found in every country that was already developed by the 70s. Britain is simply a particularly good example because of the amount of record keeping it did in the 18th and 19th centuries compared to other countries.
However, even looking only at data from the 20th century onward, which is available for any developed country, growth at the technological frontier has slowed tremendously since the 70s. Jason Crawford has already aggregated a lot of data on the slowdown here. Long story short, not much technological progress has been made outside of computing in decades.
I thought about this problem a bit more. Let’s drop the speculation about what may or may not be possible in the future and just talk about specific professions over the last 50 years.
Primary schoolteacher. That person has to give a lesson in front of a limited number of students, since children need personalized attention. So you need 1 teacher per roughly 20 students, give or take 10, and that’s their full-time job. Computers let the teacher fill out paperwork more easily, but there is more of it, so it roughly cancels out. Maybe they do a slightly better job than in the 1970s, but the productivity is the same.
Janitor. You have a mop, a broom, brushes, a cart full of supplies. For large open areas there are electric floor-washing machines of various types. Pretty sure all of this was available by the 70s. The only improvement I know of is that companies no longer keep their own staff to do it; they outsource, and the outsourcing firms may get slightly more labor per work week out of their workers.
Retail clerk. Once barcodes and registers that could scan a code, look up the price, and add it to a total were readily available, by the late 70s at the latest, that was about the limit. A clerk still has to scan each item, and credit cards take about the same time as cash.
Restaurant waitstaff, cook. Totally unchanged.
Accountant. Computers have automated huge swaths of it, but companies are far more complex than they were.
Anyway, we can go down the list and find plenty of jobs that have changed minimally if at all, and productivity per worker cannot be expected to improve if the amount of human labor needed hasn’t shrunk. Some jobs, like nuclear plant worker, have actually become less productive over time, as more workers are needed per megawatt of power due to more and more long-tail risks being discovered.
And then you can talk about how you might get a meaningful increase in productivity from each of these roles. And, well, it’s all coming up AI; I know of no other way. You must build an automated system able to perform most of the task. Some (like schoolteacher) are nearly impossible, others (like janitor) are surprisingly hard, and some are being automated as we speak (Amazon Go for retail clerks).
It’s not just about discovering more tail risks but about those companies having a different risk culture. One example someone from the industry gave me is that they hold yearly seminars telling workers how to avoid cutting themselves with paper.
Right. So this is one of those anti-progress patterns I see around. What happens internally is that, over time, the Very Serious People create some Very Serious Internal Processes, like There Shall Be Risk Management Training (on the prevention of papercuts). And anyone suggesting that maybe the company could run more efficiently by skipping this training has to argue either that (1) the company’s elders were wrong to institute such training, or (2) they personally are pro-risk.
It’s hard to be “pro-risk”, in the same way that if you spoke against, say, diversity quotas, you would by definition be for discrimination.
So over time the company adds more and more cruft, while not really deleting much, making it less and less economically efficient. This is why big rich companies have to constantly buy startups for the technology: they are unable to get their own engineers to develop the same tech (because those engineers are too busy, or too beaten down by mandatory training). And it’s why eventually most big rich companies fail, and their assets and technology get bought up by younger companies who, when the merger goes well, basically throw out the legacy trash (except when the opposite happens, as in the Boeing–McDonnell Douglas merger).
Ordering at McDonald’s is very different than it was in the past. You can now both order and pay digitally.
For cooks, Googling finds https://magazine.rca.asn.au/kitchen-innovations/ . According to it, there are various innovations in commercial kitchens, like induction cooking.
Sure. The point is that this lets you go from 10 workers in a restaurant to 9.5, or other small increments. It’s not like the tractor, fertilizer, and the other innovations that reduced farmers from 50% of the population (1900) to 2% (today).
To get that kind of change with restaurants, the only way I can see is intelligent robotics, other than just “everyone stops eating restaurant food and starts eating homogeneous soylent packets”, which we could automate fully with today’s tech. In that world, a restaurant with 10 workers gets replaced by 0.4 workers, who work offsite and respond to elevated customer service calls and elevated maintenance issues. (“Elevated” means the autonomy already tried and failed to solve the issue. While automated maintenance isn’t too common, Amazon is experimenting with automated customer service, where in my experience a bot will basically just give a refund if you have any complaint at all about an order.)
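To put rough numbers on that comparison, here is some back-of-the-envelope arithmetic using only the figures already quoted above:

```python
# Back-of-the-envelope arithmetic with the figures quoted above.
farm_share_1900, farm_share_today = 0.50, 0.02   # farmers as a share of the population
print(farm_share_1900 / farm_share_today)        # 25.0x reduction: the tractor/fertilizer scale

print(10 / 9.5)   # ~1.05x: the "small increment" path (kiosks, induction cooking, etc.)
print(10 / 0.4)   # 25.0x: the fully automated restaurant with 0.4 offsite workers
```

So the fully automated restaurant would be a tractor-scale change, while the incremental tools stay in the few-percent range.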
OK, so first, you aren’t really talking about progress, you are linking data on productivity per worker, which has gone up over the decades but at a slower pace. Why is that?
Well, the simplest theory: suppose there is a class of tasks [A] that are easy to automate, a second class [B] that is hard but feasible to automate with simple computers, and a third class [C] of modestly complex tasks with hundreds of thousands of edge cases.
Well, today, almost none of the improvements in AI you have read about are being used where it counts: in factories, warehouses, and mines, and to control trucks. This is for several reasons, the biggest being that for a “small” niche market the engineering investment isn’t currently worth it; the money is going into autonomous cars, and those aren’t finished either.
So set [A] got automated in the 1970s. Set [B] gets automated slowly, but only where demand for a product made this way is extremely high and the cost of the automation is less than paying thousands of Chinese factory workers instead (and they have gotten more expensive). Set [C] is all done by humans, but over time small tricks have reduced how many humans are required.
So that would explain the observation.
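As a toy sketch of the break-even logic described for set [B] above (the function name and every number here are invented purely for illustration):

```python
# Toy break-even rule for set [B]: automate only where it undercuts the wage
# bill it replaces. All names and numbers are invented for illustration.
def worth_automating(automation_cost_per_year: float,
                     workers_replaced: int,
                     wage_per_worker_per_year: float) -> bool:
    return automation_cost_per_year < workers_replaced * wage_per_worker_per_year

# A production line that would otherwise need 1,000 workers:
print(worth_automating(5_000_000, 1_000, 4_000))   # False while wages are low
print(worth_automating(5_000_000, 1_000, 9_000))   # True once those wages rise
```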
TFP doesn’t mean productivity per worker. It’s designed to identify economic progress that can’t be attributed to increases in labor or to capital intensification, i.e. technological progress applied to make an economy more efficient. Advances in automation should be captured by such a measurement.
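For concreteness, the growth-accounting decomposition usually used to back that out of the data looks roughly like this; it's a minimal sketch, and the 0.3 capital share and the growth rates are illustrative assumptions, not real figures:

```python
# TFP growth as the Solow residual: output growth minus the weighted
# contributions of capital and labor. Numbers below are illustrative only.
def tfp_growth(output_growth: float, capital_growth: float,
               labor_growth: float, capital_share: float = 0.3) -> float:
    return (output_growth
            - capital_share * capital_growth
            - (1 - capital_share) * labor_growth)

# Output +3%, capital stock +4%, hours worked +1%:
print(tfp_growth(0.03, 0.04, 0.01))   # ~0.011, i.e. ~1.1% attributed to "technology"
```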
You are saying “improvements in output not accomplished by spending more real dollars in equipment or having more people working”.
Hypothetically, if we had sentient robots tomorrow, they would initially be priced extremely high, such that the total cost of ownership (TCO) of such a system over time is only slightly less than a worker’s. Are you positive your metric would correctly account for such a change? This would be a revolutionary improvement that would eventually change everything, but in year 1 the new sentient robots are just doing existing jobs with less labor and very high capital costs.
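To make the concern concrete, here is the same growth-accounting arithmetic applied to a hypothetical year 1. Every number is invented to match the thought experiment: output flat, hours down 5%, and capital up just enough that its cost roughly offsets the wages saved.

```python
# Hypothetical year-1 "sentient robots" scenario. All numbers are invented.
alpha = 0.3            # capital's share of income (typical textbook value)
output_growth = 0.00   # robots only do existing jobs, so output is flat
labor_growth = -0.05   # 5% fewer hours worked
# Capital grows just enough that its weighted contribution offsets the labor
# saved, mirroring "TCO only slightly less than a worker" (~11.7% here).
capital_growth = (1 - alpha) * -labor_growth / alpha

tfp = output_growth - alpha * capital_growth - (1 - alpha) * labor_growth
print(f"measured TFP growth: {tfp:.4f}")   # ~0.0000: barely registers in year 1
```

On these assumed numbers, the revolution would only start showing up in the residual in later years, as robot prices fall or output expands.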
No, it wouldn’t. TFP is, in a sense, a lagging indicator. It captures the economic benefits of technological progress but does not evaluate emerging technologies that have yet to make an economic imprint. That said, no AI I’m aware of that presently exists is remotely comparable to human-level AI. Level 5 self-driving doesn’t even exist yet, and once the computational power used to run AI catches up with Moore’s Law, the field seems due for a slowdown.