So far as I can tell, the real issue with telling someone that the only important thing is quality is that it leads to a phenomenon known in some circles as “paralysis by analysis.” For instance, a writer could spend a day debating whether a particular sentence needs a comma, and miss that the whole page is rubbish. In sports, it is often what is meant when someone is accused of “thinking too much”: a receiver in football might spend his time thinking about how to get around a defender once he has the ball, and forget to catch the ball.
Like Jeff Atwood, I am a programmer. Unlike Jeff Atwood, I do not have a Wikipedia entry, and rightfully so. Also, unlike Jeff, I’m pretty new: unseasoned. So, unlike Jeff Atwood, I still remember the process of learning how to be a programmer.
As far as I can tell, his entry fits with my experience so far in improving myself as a programmer. I didn’t get better by theorizing about how to make a beautiful program; in fact, when I tried, I discovered the basic truth every good programmer knows: “If you’re just barely smart enough to write it, you are, by definition, not smart enough to debug it.” I spent weeks thinking about one such trouble spot, getting nowhere, before a brute-force technique fixed it within hours, and I still ended up with a pretty nice program.
I must take issue with one thing Jeff Atwood wrote, though. The vast majority of time on a nontrivial program is spent thinking, whether beforehand or while you’re trying to parse the kludge that steadfastly refuses to work. The kludge, insoluble mass that it is, can be immensely harder to fix than to replace, yet the natural impulse is always to fix. It isn’t immediately obvious, but a solution you have already written has an immense hold on your mind, especially since the act of entering it took considerable time and effort, so some due diligence in the initial decision is highly warranted.
Many programmers love iteration, which might be described by analogy as follows. Take a lump of clay that is to be a statue of a man. Make it the size of the man. First iteration complete. Form arms and legs and a head roughly where they go. Iteration two complete. Carefully delineate the broad details, such as knees, elbows, wrists, neck, ankles, and the shape of the torso. Iteration three complete. Make clear the placement of medium details, such as fingers, toes, ears, eyes, mouth, and nose. Iteration four complete. Add the fine details, roughly: delineate joints, add nails, add scars, add hair. Iteration five complete. Make the joints, nails, scars, hair, and all the other little details just about right. Iteration six complete. Decide what to improve and do so. Iteration seven complete. Check for errors. Fix them. Repeat until done. Iteration eight complete.
The analogy is actually pretty close. The only problem? Each iteration listed above could, and usually would, involve several sub-iterations and testing steps of its own.
The other major approach is far more straightforward. Make the clay look like a man. Step one complete. Is it correct? If not, repeat. Done.
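The two approaches are really two control flows: many short, reliable refinement passes versus remaking the whole thing until one attempt passes. A toy sketch (my own illustration, nothing from Jeff Atwood’s post; the numbers are arbitrary stand-ins for “clay”):

```python
def good_enough(work: float, target: float, tol: float) -> bool:
    """The quality standard: is the figure within tolerance of the ideal?"""
    return abs(work - target) <= tol


def iterative_sculpt(target: float, tol: float):
    """Many short steps: each pass closes half the remaining gap."""
    work, steps = 0.0, 0
    while not good_enough(work, target, tol):
        work += (target - work) / 2  # one small, reliable refinement
        steps += 1
    return work, steps


def one_go_sculpt(attempts, target: float, tol: float):
    """One big step per try: each attempt is a complete figure,
    remade from scratch until one meets the standard."""
    for steps, work in enumerate(attempts, start=1):
        if good_enough(work, target, tol):
            return work, steps
    return attempts[-1], len(attempts)


print(iterative_sculpt(target=1.0, tol=0.01))                   # 7 short passes
print(one_go_sculpt([0.3, 0.7, 0.995], target=1.0, tol=0.01))   # third remake passes
```

With a loose tolerance the one-go loop can finish on its first try; with a tight one, the steady halving of the iterative loop wins, which is the trade-off the next paragraphs put numbers on.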
The second way has a much greater quantity of results, because it is a simpler and quicker way to make a figure of a man. The superiority of the iterative approach shows when it is clear you will not get it right in any one try. If I make no mistakes, I might take 24 steps in the iterative approach where I would take 2 in the one-go approach. The steps are not equal in length, however (the iterative steps are much shorter), so call the real cost 8 to 2.
This still looks like a slam dunk for the all-at-once approach, and with a low quality standard it stays that way even allowing for errors: if any of the first three attempts meets the standard, all at once is still cheaper (three tries at 2 units apiece is 6, against 8 for the iterative route). Once an extremely high quality standard is demanded of the end result, however, it is quite likely that not one of the first 25 all-at-once clay men will be good enough. Even with a high standard, iteration is unlikely to need its steps amplified by more than a factor of 2 or 3. At 50 units versus 24, it is now a slam dunk in favor of iteration.
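The arithmetic above fits in one line: if every step is retried until it passes, a step that passes with probability p takes 1/p attempts on average (a geometric distribution). The specific probabilities below are my own choices, picked to reproduce the essay’s numbers:

```python
def expected_cost(n_steps: int, step_cost: float, p_pass: float) -> float:
    """Expected total cost when each step is retried until it passes.

    A step that passes with probability p takes 1/p attempts on average
    (geometric distribution), so the whole job costs n * cost / p.
    """
    return n_steps * step_cost / p_pass


# No errors: 8 short iterative steps against 2 all-at-once steps.
print(expected_cost(8, 1, 1.0))    # iteration: 8 units
print(expected_cost(2, 1, 1.0))    # all at once: 2 units, the clear winner

# Extremely high standard: roughly 1 in 25 whole figures is good enough,
# while each short iterative step needs about 3 tries.
print(expected_cost(8, 1, 1 / 3))     # iteration: about 24 units
print(expected_cost(2, 1, 1 / 25))    # all at once: about 50 units
```

Sliding p_pass for the all-at-once attempt from 1.0 down toward 1/25 is exactly the move from a low quality standard to a high one, and somewhere along that slide the two costs cross.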
In the end, both methods would give about the same amount of experience (though I don’t have space to justify that here, and good arguments against it could be made), almost regardless of the number of steps necessary. Somewhere in the middle must be a crossover point between iterating and doing it all at once, and it behooves us to figure out where we are in relation to it.
The long-winded point here is that quantity (iteration) can produce high quality quicker than quality (getting it right straight up), but only some of the time. A low quality standard corresponds to a low cost of failure, and a high quality standard to a high cost of failure. For beginners, the standard is low, and just doing it is probably the best way (though they still need to understand the task in order to make a real attempt).
For pure personal learning among the experienced, it is far trickier to be sure. True failure is highly costly, since you learn the wrong thing, but it is also less likely, and minor failures can be learned from.
I’m relatively sure Jeff Atwood understands all of this, of course, but it isn’t immediately obvious from his writing here. I’m not some guru, and this isn’t divine wisdom either, but it is always a good idea to keep in mind what is so basic that the expert has forgotten to mention it exists. After all, he is there to push for the deviation, not simply the status quo.