Right, one of the original solutions, though rarely implemented, is to add a steady stream of defective parts to guarantee optimal human attention. These artificially defective parts are marked in a way that lets them be automatically separated and recycled later, should any slip past the human QA.
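Just to make the mechanism concrete, here's a minimal sketch in Python of how such a "salted" stream and its automatic backstop might look. Everything here (the Part class, SEED_RATE, the function names) is made up for illustration, not taken from any real production line.

```python
import random

SEED_RATE = 0.02  # fraction of artificially defective parts injected


class Part:
    def __init__(self, serial, defective=False, seeded=False):
        self.serial = serial
        self.defective = defective
        self.seeded = seeded  # marked so a machine can recognize it later


def salt_stream(parts):
    """Yield the real parts, occasionally mixed with marked defects."""
    counter = 0
    for part in parts:
        if random.random() < SEED_RATE:
            counter += 1
            yield Part(serial=f"SEED-{counter}", defective=True, seeded=True)
        yield part


def automatic_backstop(parts_passed_by_human):
    """Remove any seeded defects the human missed; send them to recycling."""
    missed = [p for p in parts_passed_by_human if p.seeded]
    kept = [p for p in parts_passed_by_human if not p.seeded]
    return kept, missed
```

The nice property is that the seeded flag gives you both a safety net and a free vigilance metric: the rate at which seeded parts reach the backstop instead of the reject bin tells you how alert the inspector is.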
Wow. That’s a really cool example of careful design, taking humans into account as well as technical issues.
Yeah, I was equally impressed when one of my instructors at uni explained the concept, some decades ago, as an aside while teaching CPU design.
They apparently do this in airport x-rays; it's called Threat Image Projection: the machine injects an image of a bag with a gun to see if the observer reacts.
But apparently not for keeping pilots alert in flight… A “Fuel pressure drop in engine 3!” drill exercise would probably not, umm, fly.
There might be other ways—you could at least do it on simulators, or even on training flights (with no passengers).
Surely they already do that. The trick is that you don't know whether an abnormal input is a drill or the real thing, or at least you don't know when a drill might happen. The military solved these problems a long time ago.
Knowing when a drill might happen only improves alertness during the drill period. Drills do, however, develop and maintain the skills required to respond to a non-standard situation.