As a programmer, I currently use GPT models extensively in my work. It speeds things up. What I do is anything but easy or repetitive, but I can usually break it into simpler parts that AI can write faster than I could even review the documentation.
That said, I currently work mostly on research-like parts of the project and on PoCs. When I occasionally work with legacy code, GPT-3 is not that helpful; I haven't yet tried GPT-4 for that.
What do I see for the future of my industry? A few things, though these are loose extrapolations based on GPT progress and my knowledge of programming, not anything very exact:
The speedup of programmers' work is already here. It started with GitHub Copilot and GPT-3, even before the ChatGPT boom, and it will get more widespread and faster. The consequence is higher programmer productivity: more tasks can be done in less time, so market pressure and the talent gap for employees will shrink. This means that earnings will either stagnate or fall.
Solutions that could fully replace a junior developer, i.e., capable of writing a program or a useful fragment from business requirements without being babysat by a more experienced programmer, are not here yet. I suspect GPT-5 might be it, so I would guess it arrives in 1 to 3 years. Then it is likely that many programmers will lose their jobs. There will still be work for seniors, who will work with AI assistance on the more subtle and complex parts of systems and review the AI's output.
Solutions that could replace any developer, DevOps engineer, or system admin: I think the current GPT-4 is not even close, but it may arrive within a few years. It doesn't feel very far away, perhaps 2 or 3 GPT versions, once they make the models more capable and connect them with other types of models (which is already being done). I would guess a horizon of 3 to 10 years. Then we will likely see most programmers lose their jobs, and likely observe an AI singularity: someone will surely use AI to iterate on AI and make it refine itself.
>The consequence is higher programmer productivity: more tasks can be done in less time, so market pressure and the talent gap for employees will shrink. This means that earnings will either stagnate or fall.
Mostly agree with your post. Historically, higher productivity has generally led to higher total compensation, but how this affects individuals during the transition period depends on the details (e.g., how much pent-up demand for programming is there?).
You’re not accounting for an increase in demand for software. The tools to automate “basically every job on earth” are on the horizon but they won’t deploy or architect themselves. Plenty of work remaining.
And there are larger jobs you are not even considering. How many people would be needed to supervise and work on a self-replicating factory, a nanoforge research facility, or a city replacement effort?
There are immense things we could do that we previously had nothing even vaguely close to the labor or technical ability to attempt. Just because humans become more efficient per hour worked doesn't mean the work won't scale up even faster.
Curious for an update now that we have slightly better models. In my brain-dead webdev use cases, Claude 3.5 has passed some threshold of usability.
What about 3.5 pushes it over the threshold to you that was missing in previous models?