it’s becoming an expected skill, and there are increasingly more avenues for learning programming at increasingly younger ages … You have a recipe for an over-saturated market
This assumes that enough people can learn to program well. Just because they are expected to learn, and have many different textbooks and learning websites available, doesn't mean that enough of them will succeed. Maybe only some fraction of the population is able to master the necessary skills. Maybe we are already employing a significant part of that fraction, so we get diminishing returns from trying to make more people IT-skilled.
The field of IT keeps growing, both in scope and in complexity. Twenty years ago, making a static HTML page was a good way to make tons of money; these days everyone wants interaction, a database, and more. Twenty years ago, many people didn't know the internet even existed; some of them are willing to pay for a website now. Maybe in ten or twenty years they will pay you to write a better algorithm for their vacuum cleaner or refrigerator. Smartphones opened a new platform for programs; new hardware may open another space tomorrow.
Thirty years ago, when you turned on a computer, you were greeted by a command line. You had to type a command to do anything. The inferential distance from typing commands to creating simple programs was extremely short. Also, every computer supported some kind of programming language (e.g. BASIC) out of the box. You didn't have to install anything; the programming language was ready, and it was the same language and the same version your neighbors had, assuming you had the same kind of computer. With ownership of a computer, programming came relatively easily. These days, the gap between using your computer (clicking on icons, various mouse operations, multimedia support, etc.) and programming (typing text) is greater, and the transition is less natural. Beginning programmers today have a large inferential distance to cross.
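The closest modern analogue to that experience is probably an interactive interpreter such as the Python REPL (my comparison, not part of the original comment): at a prompt, typing a command and writing a tiny program are nearly the same act, which is what made the old inferential distance so short.

```python
# At the interactive prompt, a "command" is just an expression:
#   >>> 2 + 2
#   4
# A "program" is only a small step further: the same kind of
# statements, saved into a file and run together.
total = 0
for n in range(1, 11):
    total += n
print(total)  # prints 55, the sum of 1..10
```

The difference today is that almost nobody starts at a prompt like this; you have to go looking for it.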
I am not going to predict which direction the market pay for programmers will go; I just wanted to provide evidence for the opposite direction. In some respects, the path to programming is becoming easier (cheaper computers, good free lessons, the internet, Google, and open source); in others it is becoming harder (more distractions, more complex technologies, greater customer expectations).
Do you have an example of another industry that was high paying, well respected, and cheap to learn, that DIDN’T decline in pay and opportunities? If so, that would allow me to give more credence to your arguments.
In my career coaching work, one of the things I try to teach is how to spot the patterns that show which way a market is going. There are some classic signs, and I can give plenty of examples of other industries in which the same pattern played out.
In that case, I guess you are more likely to be correct about this than I am.
Only the “cheap to learn” part feels wrong to me. I mean, the financial cost of learning programming has been literally zero in recent years, and somehow most people still don't learn one of the highest-paying professions. Why? If they didn't do it during the last five years, why would they do it during the next twenty? Maybe ability is the problem, not the financial cost of learning.
I suspect that those other industries either employed fewer people than IT, or were easier to learn. On the other hand, IT has its own specific risk: the possibility of remote work makes it easier to outsource.
somehow most people still don't learn one of the highest-paying professions. Why?
Because they can’t. Go talk to someone from the lower half of the IQ distribution, see if they strike you as someone whose attempts to code will not result in a disaster.
Learning to program “Hello, world” is easy. Learning to write good (clear, concise, maintainable, elegant, effective, bug-hostile) code is pretty hard.
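To make that contrast concrete, here is a toy sketch in Python (my own illustration, not from the thread): the first line is the entirety of "Hello, world", while even a trivially small real task already forces the design decisions (naming, edge cases, interfaces) that separate working code from good code.

```python
from collections import Counter

# The famous first program: one line, no design decisions.
print("Hello, world")

def word_frequencies(text):
    """Count case-insensitive word frequencies in a string.

    Even this tiny task raises questions "Hello, world" never does:
    what counts as a word, how to treat case, and what to return
    for empty input.
    """
    return Counter(text.lower().split())

counts = word_frequencies("the cat saw the dog")
assert counts["the"] == 2 and counts["dog"] == 1
```

The point is not the specific task, but that the qualities listed above only start to matter once a program has a second reader or a second use.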
In my career coaching work, one of the things I try to teach is how to spot the patterns that show which way a market is going. There are some classic signs, and I can give plenty of examples of other industries in which the same pattern played out.
Examples would be appreciated. But this seems to be a case of trying to time the market, and the usual objection applies: if you can time the market to within a year, you can make huge piles of money. One of the contributors on HN, lsc of prgrmr.com, talks about how he was calling the property bubble in the Bay Area for years before it popped, and how if he had just gotten in at the frothy height of the dotcom bubble like everyone else, he'd still be ahead now on property, very far ahead.
As maia said, it’s not really about trying to time the market down to the year (or even trying to pinpoint it within 5 years)… but rather picking up on trends and trying to invest your time in the right places.
I started teaching the basic concepts after working with so many clients who had painted themselves into a corner by creating a great career in a dying industry. Some examples of industries for which the writing was on the wall:
- Print journalism
- Almost any US manufacturing job, especially textiles
- Projectionists
I suppose those are all jobs/industries that declined due to technology, although the example of technical manufacturing of computer hardware shares many similarities with programming/coding jobs today.
Here are a few jobs which have declined due to commoditization (is that a word?) of the knowledge:
- Typists
- Data entry specialists
- Computer operators
Those are just the ones that have continued to decline recently (in the past 15 years) as the skills have gone from specialist to commodity. If you go further back, you'll find similar examples for most new technologies that initially command high pay for the specialists who operate them, but which become very cheap to learn.
And here are some of the jobs and industries which, if my clients insist on taking them, I recommend they leverage into another job title or industry as soon as possible:
- Social media/community manager
- Programmer
- Anything in print journalism
I suspect that predicting pay trends for a career path doesn't need to be that precise in order to be useful. If you can predict the year in which a decline will happen, you can make huge piles of money. If you can only predict the decade, maybe you can't do that, but you can still choose to do something else.
Do you have an example of another industry that was high paying, well respected, and cheap to learn, that DIDN’T decline in pay and opportunities?
It’s cheap to learn if you are intelligent and are already good at abstract thinking.
A lot of Indian programmers get paid quite poorly because they don't have the hacker mindset. Teaching the hacker mindset is not straightforward, because there is often a lot of culture in the way.
There are Indians who manage to become good programmers, but most people who learn programming at an Indian university don't.