I’m not sure I’m following. Janitors are also great; nobody would want to set foot in a business or storefront with trash everywhere, and without a janitor you would lose most if not all of your business quite fast. Yet janitorial work is poorly paid, because the supply of janitors is high.
Most such roles can be said to have a high impact on a company. Isolating any role this way, you could hypothesize that it should be paid 10x what it is, since without it the company would be in ruins. Unfortunately that reasoning doesn’t match reality: pay tracks supply and demand, not counterfactual indispensability.
To my understanding, that is the point of the argument being made: why are programmers paid so highly when there are so few barriers to becoming one, which should make the supply of programmers higher than it is? If programmers are so amazing and high-achieving, there should be many people lining up to become one (since, as the argument theorizes, doing so is easy).
If a janitor quits, a new janitor can be hired the next day with minimal disruption. If a programmer quits, it takes half a year before a newly hired replacement has acquired the context, the departing programmer may take expertise about your business to a competitor, and there’s a significant risk that the replacement hire will be bad. Projects and businesses do sometimes fail because their programmers quit. This means that even if there were an oversupply of programmers, it would still be worth paying them well in order to increase retention.
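To put rough numbers on that retention logic, here’s a minimal back-of-envelope sketch in Python; every figure in it (the salary, the raise, the bad-hire probability and cost) is an assumption invented for illustration, not data from this thread:

```python
# Purely illustrative back-of-envelope; every number is an invented assumption.
salary = 150_000              # assumed annual programmer salary, in dollars
ramp_up_years = 0.5           # the "half a year" of context-building above
bad_hire_prob = 0.25          # assumed chance the replacement doesn't work out
bad_hire_cost = 0.5 * salary  # assumed cost of a failed hire (recruiting, rework)

# Expected one-time cost of losing one programmer, under these assumptions.
replacement_cost = ramp_up_years * salary + bad_hire_prob * bad_hire_cost
print(f"expected replacement cost: ${replacement_cost:,.0f}")  # -> $93,750

# A retention raise pays for itself if it cuts annual attrition by enough.
raise_cost = 0.10 * salary    # assumed 10% retention raise, per year
break_even = raise_cost / replacement_cost
print(f"raise breaks even if attrition drops {break_even:.0%}/yr")  # -> 16%
```

The specific numbers don’t matter; the point is that the expected cost of a departure is the same order of magnitude as a year of raises, so retention pay can be rational even when candidates are plentiful.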
I agree, though these factors are exacerbated by the field being so new.
How? Is the model that, as the field matures, programmers will get more fungible? Because it actually seems like programmers have gotten less fungible over time (as both projects and tech stacks have increased in size) rather than more.
It seems to me that there is pressure on developers to become “full-stack developers” and take on “dev-ops” work, which would make them more fungible. But there are also other forces working in the opposite direction, and those seem to be stronger at the moment.
My model is that, over time, systems get more similar across companies, as we learn the best ways to do things and get good open-source infrastructure for the common pieces.
But you may be right: there’s a really strong tendency to build layers on top of layers, which means, for example, that “familiarity with the Google Ads stack” is very important to the company but not a very transferable skill.