If a janitor quits, a new janitor can be hired the next day with minimal disruption. If a programmer quits, it may be half a year before a newly hired replacement has acquired the context; the departing programmer may take expertise about your business to a competitor; and there’s a significant risk that the replacement hire will be bad. Projects and businesses do sometimes fail because their programmers quit. This means that even if there were an oversupply of programmers, it would still be worth paying them well to increase retention.
I agree, though these factors are exacerbated by the field being so new.
How? Is the model that, as the field matures, programmers will get more fungible? Because it actually seems like programmers have gotten less fungible over time (as both projects and tech stacks have increased in size) rather than more.
Seems to me that there is pressure on developers to become “full-stack developers” and “dev-ops”, which would make them more fungible. But there are also other forces working in the opposite direction, which seem to be stronger at the moment.
My model is that over time systems get more similar across companies, as we learn the best ways to do things and good open source infrastructure emerges for the common tasks.
But you may be right: there’s a really strong tendency to build layers on top of layers, which means that, for example, “familiarity with the Google Ads stack” is very important to the company but not a very transferable skill.