I meant that at, perhaps, a more basic level. Taboo “calculus”.
If you ask a programmer for an “estimate” (most often, of the time a given task will take), everyone thinks it natural to give you a single number.
That’s what I did along with everybody else. It came as a shock to me the first time I stumbled across the idea of expressing estimates with degrees of confidence: that is, my 50% confidence estimate should be such that half of the time I’d be early and half of the time I’d be late. (To many a programmer, the notion of finishing early compared to an estimate is counter-intuitive in its own right.)
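A minimal sketch of what a confidence-qualified estimate looks like in practice. The historical durations here are invented for illustration; the point is only that an “estimate” becomes a percentile of past outcomes rather than a single number:

```python
import numpy as np

# Invented historical durations (in days) for tasks judged "similar";
# illustrative numbers, not from any real project.
durations = np.array([2.0, 2.5, 3.0, 3.0, 3.5, 4.0, 5.0, 6.0, 8.0, 12.0])

# 50%-confidence estimate: half the time we'd finish earlier, half later.
p50 = np.percentile(durations, 50)

# 90%-confidence estimate: we'd expect to beat this 9 times out of 10.
p90 = np.percentile(durations, 90)
```

Note how far apart the two numbers sit even in this toy data: quoting the 50% figure as if it were the 90% figure is one concrete way a plan goes wrong.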
Still later I hit upon (more accurately, got it hammered into my head) the idea that you could represent an estimate most accurately as a distribution: the frequency of finishing a similar task in time t. Taking t as a continuous parameter, you could represent the estimate as a curve.
And still later I figured out that you could interpret the same curve as a probability density for a single task.
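The same reinterpretation can be sketched numerically. Purely as an assumption for illustration, suppose task durations follow a lognormal curve (a right-skewed “usually fine, occasionally disastrous” shape often proposed for task times; the parameters here are invented):

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed-for-illustration lognormal durations; parameters are invented.
samples = rng.lognormal(mean=1.0, sigma=0.7, size=100_000)

median = np.median(samples)   # the 50% point of the curve
mean = samples.mean()         # dragged upward by the long right tail

# Reading the curve as a probability density for a single task:
# the chance of finishing within the median estimate is about 1/2.
p_within_median = (samples <= median).mean()
```

The mean sits well above the median on a skewed curve, which is one way to see why adding up “typical” per-task figures still produces a plan that slips.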
This series of insights unlocked for me a great deal of clarity about why project planning so often went wrong, and practical ideas for doing something about it.
In all this time I don’t think I learned anything about calculus that I didn’t know before. I had and still have no idea what the Radon–Nikodym theorem is (but I’ll check it out), and only the vaguest notion of what measure theory is.
I’ve rarely had occasion to wish that more programmers could recall offhand, say, what the integral of e^x is, or how to apply the product or chain rules, or how to get a Taylor series expansion.
“the need for the ideas in their head to form a coherent logical structure”
That’s actually what turned me off math years ago. As a computer programmer I’m most comfortable with the style where you first define something before you use it, and build gradually toward higher levels of abstraction. In school this was often seriously compromised in favor of “memorize this and never mind how to construct it in the first place”.