I have a major problem with the framing of your question.
Say we invented construction robots that, given a blueprint for a building and an auto-generated list of materials deliveries, take the materials off trucks and assemble the building. This means you no longer need ‘shift bosses’; the computers do that. You are essentially down to five main roles:
The architects of the building
The site foreman (who oversees construction)
Lawyers, to deal with getting permits and to sue the local jurisdiction when it denies them in violation of local or state law (which can be automated)
Inspectors
Financing people
You may notice that the people replaced (welders, tradesmen, crane operators, etc.) are less skilled than the remaining people. (I'm not claiming blue-collar work is unskilled, but the time to learn to do it well enough to work independently is a few months of on-the-job training, with some skill gained over the years after.)
This would be true for software also: the remaining people have to be more skilled. The idea of a “pointy-haired boss” with no understanding of software designing a whole app that works at production-scale reliability is false.
You are imagining a scenario in which computer programmers are completely automated away, rather than one where the intellectual ceiling for becoming a computer programmer is lowered and thus more people migrate to software engineering from other jobs. I don’t find your scenario as plausible as mine, but I suppose it could happen.
I am saying those less-skilled people wouldn’t be adding value, because the remaining tasks are the hardest ones, the ones LLMs can’t do.
It’s all architecture, dealing with coupling, and extremely difficult debugging where the error messages lie to you and Google has nothing on the problem.
So no, unskilled people won’t migrate in.
I mean, did unskilled people flood into farming when tractors were invented? Or are the remaining tasks (maintaining and operating heavy equipment, and planning farm interventions) more skilled?
I think you are underestimating the level of exception handling required to completely automate the average software engineer’s job, as happened to unskilled farmhands and factory workers. A slightly atypical few hours for a software engineer at the moment, as an example, might be discovering that the logging facility stopped working on an important VM, SSHing in and figuring out what went wrong, and then applying a patch to another related piece of software to fix the bug. LLMs could coach regular people through that process, looking over their shoulder like a senior engineer, but they couldn’t automate the whole thing, not because the individual pieces are too intellectually difficult but because it requires too much diverse and unsupervised tool use and investigation. If some AI successor to LLMs could be trusted to do that in the next few years, then we probably have only a short while until something FOOMs.
This argues against your own point from your last reply. You said “more people migrate to software engineering from other jobs”; your reply above contradicts that.
Hm, did I? I think if an over-the-shoulder senior engineer becomes a rounding error in terms of expenses then the solution is in fact to hire three times more engineers and pay them three times less. What do you think the implications of what I said are?
Because anything the AI cannot figure out on its own from the error, or by logging in, requesting logs, and then opening them up (which can be trivially added to current-gen AI), is not something a “junior” human engineer is likely to figure out either.
As in other industries all the other times this happened, I instead expect 1/3 the number of engineers (for a given quantity of software), paid 3 times as much.
And because what you just described stems from faulty architecture. A big reason current systems are often so hard to debug and so “exception-filled” is that they have trash designs. As in, they are so bad that a competent architect could trivially create a better one, but it costs so much money to rebuild a software product from scratch that the architecture becomes locked in, the technical debt permanent.
This all vanishes if AI “senior engineers” can churn out all-new code to satisfy a new design, passing product-level tests, in a few months.