I have observed similar behavior in others. Only I called it ‘blackboxing’, for lack of a better word. I think this might actually be a slightly better term than ‘learned blankness’, so I hereby submit it for consideration. It’s borrowed from the software engineering idea of a black box abstraction.
People tend to create conceptual black boxes around certain processes, and they are remarkably reluctant to look inside them, even when something goes wrong. This is what seems to have happened with the dishwasher incident. The dishwasher was treated as a black box: its input was dirty dishes, its output clean ones. When it malfunctioned, it was hard to see it as anything else. The black box was broken.
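For readers less familiar with the software engineering sense of the term, here is a minimal sketch in Python; the names (Dishwasher, run) are purely illustrative and not from the original discussion:

```python
# A minimal sketch of a black box abstraction (illustrative names only).
# Callers see an interface: dirty dishes in, clean dishes out.
# Everything else is deliberately hidden from them.

class Dishwasher:
    def run(self, dirty_dishes: list[str]) -> list[str]:
        # The user never looks in here. When this step fails,
        # the temptation is to conclude "the black box is broken"
        # rather than to read what it actually does.
        return [self._wash(dish) for dish in dirty_dishes]

    def _wash(self, dish: str) -> str:
        # A private detail the abstraction hides on purpose.
        return f"clean {dish}"


print(Dishwasher().run(["plate", "mug"]))  # ['clean plate', 'clean mug']
```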
Of course, engineers and programmers often go out of their way to design highly opaque black boxes, so it’s not surprising that we fall victim to this behavior. This is often said to be done in the name of simplicity (the ‘user’ is treated as an inept, lazy moron), but I think an additional, more surreptitious reason is to keep profit margins high. Throwing out a broken dishwasher and buying a new one is far more profitable to the manufacturer than making it easy for users to take it apart and fix it themselves.
The open source movement is one of the few prominent exceptions to this that I know of.
There’s also a much more important reason. To quote Alfred North Whitehead:
Civilization advances by extending the number of important operations which we can perform without thinking about them. Operations of thought are like cavalry charges in a battle — they are strictly limited in number, they require fresh horses, and must only be made at decisive moments.
Humans (right now) simply don’t have enough cognitive capacity to understand every technology in detail. Without black boxes, nobody could get anything done today.
The real issue is whether we’re willing to peek inside the box when it misbehaves.