Most important is the historical evidence of highly innovative ‘monopolies,’ like Carnegie Steel, Standard Oil, AT&T, and Microsoft.
I don’t know about Carnegie Steel and Standard Oil, but calling AT&T and Microsoft “innovative monopolies” is a bit of a misnomer in my book. These companies were innovative. Their innovation allowed them to become monopolies. After they reached their monopoly position, they largely stopped innovating, outside of making incremental improvements to their core products.
In the case of AT&T, once it had its monopoly position, what innovations did it bring to the consumer? Sure, the transistor was invented at Bell Labs, but it was actually commercialized by Shockley Semiconductor, and later by Fairchild Semiconductor. Likewise, Bell Labs also invented Unix, but it was actually commercialized by companies like SCO, HP, Sun, and SGI.
Microsoft, in turn, made significant strides in consumer desktop operating systems, but once it had its monopoly, it stagnated. Sure, MSR came up with innovation after innovation, from practical microkernels to a predecessor to the iPad, but were those innovations ever brought to market? No. They were dismissed as research projects and were written off as the cost of finding “real” innovations that would incrementally improve the core desktop/server business. Meanwhile, Microsoft worked hard to use its dominant monopoly and patent positions to suppress competition from other operating systems and alternate modes of computing.
It seems to me that in practical terms, there is no such thing as an “innovative monopoly”. Once a company reaches a monopoly position, its incentive structure is to suppress all innovation that does not improve its core business. It has moved from an “explore” model to an “exploit” model. When that exploitation is threatened, it uses regulatory means to position itself as a “gatekeeper” or “troll” (as Sarah Constantin puts it) to extract a tax in order to make competition either completely unviable or get enough incremental revenue to fund its own efforts to compete with the new entrant.
If you could teleport a modern steel company back in time, they could put Carnegie out of business;
Is that actually true? Or would Carnegie be able to use his monopoly position to simultaneously undercut the modern steel company’s prices, while using his influence with other sectors of the economy to drive up input costs, using his influence with government regulators to increase the regulatory burdens that would have to be borne by the smaller competitor, and simultaneously working to reverse-engineer the proprietary secrets of his new competitor? Either that, or (like Facebook, Google, or Apple today) he’d just straight up buy his competition, and either incorporate its innovations into his products (if they were compatible) or just suppress them (if they were incompatible). Or if he weren’t able to eat the company wholesale, he’d poach the talent, much as Google, Facebook, and Uber poach AI talent from research labs.
It’s not at all clear to me that a modern steel corporation, teleported back in time, and placed in a competitive position to Andrew Carnegie would be able to actually successfully compete with Carnegie without government intervention.
So given all this, why are there new firms at all? Why was Microsoft able to topple IBM? Why were Google and Apple able to topple Microsoft?
I think the answer lies in the original firm not knowing fully who its competitors are. IBM thought of itself as a hardware manufacturer, and specifically a mainframe hardware manufacturer. Microsoft was a PC software manufacturer. The two categories were worlds apart from IBM’s perspective.
Its executives could not perceive the PC as a threat. Rather, they saw the PC as colonizing a new market, a market that IBM’s core business had no real interest in. Then, by the time that IBM realized that the vast PC market was going to limit demand for mainframes, it was too late. Microsoft had already established a dominant position among personal computers and it would have been cost prohibitive for IBM to compete. In fact, IBM recognized this, and attempted to “cooperate” with Microsoft on a new operating system. However, Microsoft was able to see the effort for what it was and block IBM’s attempt to regain control of the desktop PC market.
Microsoft, similarly, initially dismissed the iPhone as a joke. It did not see that the future of computing was in mobile devices, not desktop and laptop machines running Windows. Just like IBM, it focused on its core businesses of operating systems and productivity suites, and only realized too late that the core business had been walled in by the much larger mobile computing market. And again, just like IBM, by the time it realized this, it was far too late for it to establish a beachhead and compete, even though it tried repeatedly.
In the same way, Xerox actually invented most of the technologies that led to its demise. The personal workstation, the GUI, the mouse, and many other innovations were all invented at PARC. However, Xerox executives never imagined these innovations displacing the copier, so they felt free to dismiss them as pie-in-the-sky toys to keep the researchers happy.
Google, Facebook, and Apple, so far, have not made these same mistakes. Facebook, especially, has been aggressive about acquiring competitors that could build alternate social graphs, even if their functionality is radically different from Facebook’s. Likewise, Google has recognized that its core business is to funnel ads to viewers, and has not been shy about acquiring other firms whose business model threatens Google’s ability to do so. Whether this aggressiveness will last depends on a whole host of contingent factors, like executive competence, regulatory oversight and short-term market pressures on the core business. In the near term, however, the new tech monopolies do appear to be qualitatively more paranoid in a way their predecessors were not.
Microsoft, similarly, initially dismissed the iPhone as a joke. It did not see that the future of computing was in mobile devices, not desktop and laptop machines running Windows.
I don’t think that’s fair. Microsoft did have a mobile OS before Apple. It’s just that they didn’t get the idea that a very simple UI is better than a UI accessed via a stylus along with a hardware keyboard. I think they also didn’t have the haptic feedback that you get on iOS and Android when software buttons are pressed. It’s quite interesting that you can give a user who taps a software button haptic feedback without them being aware that the phone in their hand vibrates when they press it. There were a bunch of interesting ideas that were needed to get the concept to work.
There were a bunch of interesting ideas that were needed to get the concept to work.
Microsoft could have invented those, instead of Apple, if Microsoft invested the resources into mobile devices that Apple had. Instead, Microsoft treated the entire mobile device space as a sideshow, or an adjunct to desktop computing.
Meanwhile Apple, perhaps because of its success with the iPod, invested heavily in mobile device hardware engineering, mobile UI research, building relations with suppliers, and all the other necessary things to not just invent the iPhone, but manufacture it and sell it by the millions.
AT&T did not sell new products because it was banned from doing so from 1956, and probably effectively from earlier. But, as Vaniver says, it kept pursuing computers because it had a huge internal use for them, to make its telephone system cheaper to run, such as to replace the 300k switchboard operators.
Do you have a source for that? My understanding was that AT&T was the one doing the banning, by preventing consumers from connecting unapproved telephones (and other telecommunications equipment) to its lines.
That’s also true, but why would that have any effect on your belief about whether the government banned AT&T from selling unrelated products?
I was just quoting wikipedia, but googling “at&t 1956” brings up this:
1956: AT&T agreed to a consent decree that ends the [1949] lawsuit. The company agrees to confine its activities to common-carrier communication services and government projects. It promises to only manufacture products needed by the Bell system and agrees to make its existing patents available to anyone without charge.
or, if you don’t like pdfs, how about the wsj:
In 1956, AT&T signed a consent decree with the federal government that allowed it to keep its structure under which it sold both phone service and telephones themselves. In exchange, AT&T promised to stay out of other businesses and license its patents freely. AT&T’s equipment arm, Western Electric, had to withdraw from selling sound equipment for film producers and movie theaters—giving up experience in a competitive market that would have proved useful later.
After they reached their monopoly position, they largely stopped innovating, outside of making incremental improvements to their core products.
Most innovations are incremental improvements to core products. Yes, historical firms have rarely made use of revolutionary basic innovations, even if they invented them, but this seems to primarily have to do with institutional inertia.
Is that actually true?
Carnegie’s edge over his competition was something like five or ten years; a modern firm would be about one hundred years of research into metallurgy, tool generation, and industrial organization ahead. I agree that it’s more likely that they’d attempt to consolidate, and that industrial espionage would be a risk, but overall it seems pretty easy to call.
Carnegie’s edge over his competition was something like five or ten years; a modern firm would be about one hundred years of research into metallurgy, tool generation, and industrial organization ahead.
But how much of that innovation would it be able to actually use? How are you going to use processes that rely on high-precision components when you don’t have CNC machines to manufacture them? How are you going to check for defects with ultrasound when ultrasound machines are 50 years in the future? How are you going to use processes that rely on electricity when the power generation capacity to actually run those processes doesn’t exist yet? How are you going to use your newfangled industrial organization systems when your workforce hasn’t mentally incorporated many of the social norms needed for an industrialized society? How do your engineers even communicate with e.g. machinists, when there’s been a hundred years of notational changes?
I have a great story from the petrochemical industry on this very topic. A firm is trying to optimize one of its chemical plants, built a mere thirty years ago. In theory this should be an easy job. The firm looks up its own documentation on the plant, plans the improvements, and executes them. In practice:
It was designed by a group that no longer exists, in a company that has since merged, in an office that has been closed, using non-digital methods that are no longer employed.
Transporting a modern steel mill a hundred years into the past would easily be an order of magnitude more difficult than this, and I can see the difficulties therein easily wiping out the potential efficiency gains.
How are you going to use your newfangled industrial organization systems when your workforce hasn’t mentally incorporated many of the social norms needed for an industrialized society?
When I said “company,” I was also including the employees, as potentially more valuable than the physical plant. Including the supply chain also is probably unfair, but I expect they could get pretty far with converting on-hand present-day cash to supplies before being teleported back in time.
Transporting a modern steel mill a hundred years into the past would easily be an order of magnitude more difficult than this, and I can see the difficulties therein easily wiping out the potential efficiency gains.
This seems possible to me; even if they have to abandon lots of tools as unsupported by their new supply chain, I imagine them getting a lot out of a hundred years of scientific development.
I was also including the employees of the company itself, and all the machinery and equipment that the company has on hand. However, today, we live in a world of just-in-time supply chains, with companies subcontracting various specialized functions out to various subcontractors. Very few companies are the sort of vertically integrated enterprises that Ford or US Steel were at the turn of the century. As a result, a lot of the knowledge that you’re attributing to the individual company is actually in the supply chain for the company, not the company itself. This applies not just to the nitty-gritty of actually building components, but even matters of overall product design. As an example, Foxconn has a large say in iPhone manufacturing, because, at the end of the day, they’re the ones putting the things together. In the auto industry, parts suppliers like Mopar or Denso often know more about the cars that Chrysler and Toyota build than Toyota or Chrysler themselves.
In addition, companies today carry much less inventory than they used to, as a result of the aforementioned JIT supply chains. So even if your modern steel corporation could be operated with 1899-era machinery and infrastructure, it still might not do you any good, because you’d rapidly run out of iron, coke, and the special alloying agents needed to make modern high-strength steel. Today, you can rely on computerized inventory management and logistical planning systems that are deeply integrated with your suppliers to ensure that new supplies of raw materials and spare parts reach your production facilities right when they’re needed. How are you going to do that in 1899? Hire a bunch of telegraph clerks?
Moreover, this reliance on supply chains forms another pressure point that a monopolist like Carnegie could use to suppress your firm. He could work with the railroads (which were not operating under the common carrier regulations that they operate under today) to ensure that your factories were starved of raw materials, while his factories were well supplied. No amount of scientific expertise can conjure finished steel out of thin air.
Finally, you allude to cash reserves, but even that might be a problem. Not all corporations are as cash rich as Apple or Microsoft. And even Apple and Microsoft keep large amounts of their “cash” in various short-term investments, offshore holding companies, and various tax shelters. It’s not like Apple has a giant underground vault filled with hundred dollar bills in Cupertino. So if you transported a company back in time, it might not have enough cash on hand to make payroll, much less sustain ongoing operations.
I think many people strongly underestimate how interconnected and interdependent firms in the modern economy are. Many people have an idea of how firms operate that was beginning to be obsolete in the ’80s, and is wholly obsolete today. Firms are increasingly disaggregated, as computerization has reduced transaction costs, allowing for a more Coasean equilibrium.
That said, many of these innovations have only really taken root since 1980 or so. Prior to that firms were more like the sort of vertically integrated enterprises that people envision when they think of a classic industrial corporation. Perhaps US Steel from 1968 could go back in time and compete effectively with the 1899 version of itself. But I’m not sure that the US Steel from 2018 could do the same.
Once a company reaches a monopoly position, its incentive structure is to suppress all innovation that does not improve its core business.
Once an actor reaches uncontested dominance, its incentive structure is to suppress all change that does not improve its position.
In my more paranoid moments, I suspect there’s something like this going on in general: American power actors want stagnation and fear change, because change can be destabilizing and they’re what would be destabilized. This is obviously true in the case of cultural power, but I’m not sure how it would extend beyond that.