I don’t know if I’m willing to agree with that. The main reason is complexity, which keeps growing. We’re basically talking about amateur coders: programming isn’t their main skill, but they can do it. As the complexity of the environment increases, I’m not convinced their by-definition-limited skills can keep up.
There are at least two directions to this argument. One is that not many non-professional programmers are good programmers. The typical way this plays out is as follows: some guy learns a bit of Excel VBA and starts by writing a few simple macros. Soon he progresses to full-blown functions and pages of code. In a few months he has automated a large chunk of his work and is happy. Until, that is, it turns out that there are errors in his output. He tries to fix them and can’t: two new errors pop up every time he claims to have killed one. Professionals are called in, and they blanch in horror at the single 30-page function that works by arcane manipulation of a large number of global variables, all with incomprehensible three- or four-letter names, not to mention hardcoded cell references and magic numbers. The code is unsalvageable; it has to be trashed completely, and all the output it produced re-done.
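A toy illustration of what provokes the blanching (in Python rather than VBA, and with invented names and numbers, since the story doesn’t give us the real code): the same tiny calculation written in the amateur style described above, then in a form a professional could actually maintain.

```python
# The amateur style in miniature: cryptic global names, hardcoded
# column positions, and an unexplained magic number.
tbl = [["widget", 3, 2.5], ["gadget", 5, 4.0]]  # name, qty, price
tot = 0

def calc():
    global tot
    tot = 0
    for r in tbl:
        tot += r[1] * r[2] * 1.2  # 1.2: some tax rate? nobody remembers

# The same logic, written so it can be read and fixed later.
TAX_RATE = 1.2

def total_cost(rows, qty_col=1, price_col=2, tax_rate=TAX_RATE):
    """Sum quantity * price * tax over the given rows."""
    return sum(r[qty_col] * r[price_col] * tax_rate for r in rows)

calc()
assert abs(tot - total_cost(tbl)) < 1e-9  # both compute 33.0
```

At two lines of logic the difference looks cosmetic; spread the first style over 30 pages and you get the unsalvageable function from the story.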
The second direction is concerned not with the complexity of the task but with the complexity of the environment. Consider the basic task of opening a file, copying a chunk of text from it, and pasting it into another file. That used to be easy (and is still easy if the files are local ASCII text files and you’re in Unix :-D). But now imagine that to open a file you need to interface with the company’s document storage system. You have to deal with security, privileges, and permissions. You have to deal with the versioning system. Maybe the file itself is not really a file in the filesystem but an entry in a document database. The chunk of text that you’re copying might turn out to be in Unicode and contain embedded objects. Etc., etc. And the APIs of all the layers that you’re dealing with are, of course, written for professional programmers who are supposed to know this stuff well...
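For contrast, the easy local case really is a few lines. A sketch in Python (file names and contents invented for illustration): copy the lines you care about from one plain-text file into another. The corporate version would have to wrap each of these steps in authentication, versioning, and database calls.

```python
from pathlib import Path
import tempfile

def copy_matching_lines(src: Path, dst: Path, prefix: str) -> int:
    """Append every line of src that starts with prefix to dst.

    Returns the number of lines copied.
    """
    chunk = [line for line in src.read_text().splitlines()
             if line.startswith(prefix)]
    with dst.open("a") as f:
        for line in chunk:
            f.write(line + "\n")
    return len(chunk)

# Exercise the function on throwaway files in a temp directory.
with tempfile.TemporaryDirectory() as d:
    src = Path(d) / "report.txt"
    dst = Path(d) / "summary.txt"
    src.write_text("header\nIMPORTANT: totals look wrong\nfooter\n")
    copied = copy_matching_lines(src, dst, "IMPORTANT")
    assert copied == 1
```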
I think you’re judging the hypothetical amateur programmer too harshly. So what if the code is ugly? Did the guy actually save time? Does his script make more errors than he would if he did everything by hand? Is the 30-page function really necessary to achieve noticeable gains, or could he still get a lot from sticking to short code snippets and thus avoiding the ugliness?
Similarly with the second example. Maybe some steps of the workflow will still have to be done manually. That wouldn’t fly in a professionally programmed system. But if someone is already being paid to do everything manually, then as long as they can automate some steps, it’s still a win.