Dremeling
In high school, I was on a robotics team (FRC) that, in all honesty, was basically incompetent. (Me included.) We were bad at building a robot, bad at programming it, bad at design, and, most relevantly for this post, bad at planning ahead.
The particular instance that added a term to my internal monologue was this: we needed to cut slots in some aluminum in order to do...something, I don’t remember what. The cut needed to be about a foot long, so it was clearly worth getting a dedicated tool—I forget which, probably a jigsaw. But we didn’t have one, and it was going to take a week or so to arrive.
What we did have was a dremel tool. Dremels are small, portable grinders, usable for many things but rarely the best tool for the job. Using a dremel as a slot cutter is workable but inefficient; it quickly ruins the bit and takes a long time. Nonetheless, despite knowing that the job could be done much more quickly and efficiently if we waited a week, people on our team repeatedly returned to dremeling the slots.
I think this is a common pattern. It’s similar to the politician’s syllogism (“Something must be done. This is something. Therefore, this must be done.”), and to Lost Purposes, but distinct. “Dremeling” is the act of doing something you know is a bad way to attack the problem, and probably a waste of time, because it moves you incrementally towards the goal, and you can’t do anything more helpful right now.
There are better ways; you know there are better ways and have at least a pretty good idea of what they are and how you could switch to one of those ways. But, in the moment, the better ways are unavailable and the bad way is available. And so you do the (dumb, wasteful, barely-helpful) thing you can do right now, because it will technically make progress.
Why does this happen? Well, not doing anything is uncomfortable. It often feels like failure. If others see you not doing anything, it may open you up to shame; at the national scale this is the politician’s syllogism, but the politics of a peer group are just as terrifying as, and probably more terrifying than, the party politics of a large polity.
So, as we learn from (Brooks & Reiner, 1975), it stems from fear.
A more common term for this might be “precrastination”.
I think that’s different, but I’m having difficulty articulating how, so I may be wrong about that.
The question is whether dremeling is actually worse than sitting around doing nothing. If there is programming you could be getting on with, or something like that, then go do that and come back in a week. But if making those slots is on the critical path, and the speed-up is more important than any loss of quality, go ahead and dremel.
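As a toy sketch of that decision rule (all the numbers and names here are made up for illustration, and quality loss is not modeled):

```python
# Toy model: dremeling only wins if the slot-cutting is on the critical path
# and starting now with the slow tool still finishes before waiting for the
# right tool to arrive and then cutting quickly.

def should_dremel(days_dremeling, days_until_tool_arrives, days_with_tool,
                  on_critical_path=True):
    """Return True if starting now with the bad tool shortens the schedule."""
    if not on_critical_path:
        return False  # the schedule doesn't care; go do something more useful
    return days_dremeling < days_until_tool_arrives + days_with_tool

print(should_dremel(10, 7, 1))  # False: waiting for the jigsaw is faster overall
print(should_dremel(5, 7, 1))   # True: dremeling actually finishes sooner
```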
So in the specific original instance, I think there were actually better things we could have been doing. But Edison’s favorite aphorism applies: That would have required thinking.
Possibly there is some more detailed analogy to be made about confronting uncertainty and tending to under-explore and over-exploit.
Almost no human actions are anywhere near optimal for long-term real-world goals. Truly optimal pursuit of long-term real-world goals looks like a superintelligence taking the minimum-time path to nanotechnology.
There are always better things you could be doing; the question is whether you will actually think of one of them given a little more thinking.
-
I like this example! And the word is cool. I see two separately important patterns here:
The first is preferring a single tool (the dremel) that is mediocre at everything, instead of many specialized tools that collectively perform better but require you to switch between them more often.
This btw is the opposite of “horizontal segmentation”: selling several specialized products to niche markets rather than a single product which appeals moderately to all niches.
It often becomes a problem when the proxy you use to measure/compare the utility of something with respect to different use-cases (or its appeal to different niches/markets) is capped[1] at a point that prevents it from detecting the true comparative differences in utility.
Oh! It very much relates to scope insensitivity: if people are diminishingly sensitive to the scale of different altruistic causes, then they might overprioritize instrumental options which are just-above-average along many axes at once.[2] And indeed, this seems like a very common pattern (though I won’t prioritize time thinking of examples right now).
It’s also a significant problem with respect to karma distributions on forums like LW and EAF: posts which appeal a little to everybody will receive much more karma than posts which appeal extremely to a small subset. Among other things, this causes community posts to be overrated relative to their appeal.
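A minimal sketch of that capped-proxy point (the one-upvote-per-reader cap and all the utility numbers are assumptions chosen for illustration):

```python
# Toy model: each reader gets some true utility from a post, but the karma
# proxy caps each reader's contribution at a single upvote, so intensity of
# appeal above the cap is invisible to the ranking.

def karma(reader_utilities, cap=1.0):
    """Proxy score: each reader's contribution is capped at `cap`."""
    return sum(min(u, cap) for u in reader_utilities)

def total_utility(reader_utilities):
    """The quantity we actually care about."""
    return sum(reader_utilities)

broad_post = [1.0] * 100   # appeals a little to 100 readers
niche_post = [20.0] * 10   # appeals enormously to 10 readers

print(karma(broad_post), total_utility(broad_post))  # 100.0 karma, 100.0 utility
print(karma(niche_post), total_utility(niche_post))  # 10.0 karma, 200.0 utility
# The capped proxy ranks the broad post far above the niche post even though
# the niche post produces twice the total utility.
```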
The second, as Gwern pointed out, is “precrastination” / “hastening of subgoal completion” (a subcategory of greedy optimization / myopia).
I very often notice this problem in my own cognition. For example, I’m biased against using cognitive tools like sketching out my thoughts with pen-and-paper when I can just brute-force the computations in my head (less efficiently).
It’s also perhaps my biggest bottleneck with respect to programming. I spend way too much time tweaking-and-testing (in a way that doesn’t cause me to learn anything generalizable), instead of trying to understand the root cause of the bug I’m trying to fix, even when I can rationally estimate that the latter will take less time in expectation.
If anybody knows any tricks for resolving this / curing me of this habit, I’d be extremely grateful to know...
[1] Does it relate to price ceilings and deadweight loss? “Underparameterization”?
[2] I wouldn’t have seen this had I not cultivated a habit of trying to describe interesting patterns in their most general form—a habit I call “prophylactic scope-abstraction”.
-