Book review: The Checklist Manifesto

Atul Gawande’s The Checklist Manifesto was originally published in 2009. By the time I read it a few years ago, the hard-earned lessons explained in this book had already trickled into hospitals across North America. It’s easy to look at the core concept and think of it as trivial. For decades, though, it was anything but obvious.
Atul Gawande walks readers through his experience of how the modern medical system fails. The 20th century saw vast increases in medical knowledge, both through a richer understanding of the body and from swathes of new drugs, tests, and surgical procedures. And yet, mistakes are still made; diagnoses are missed, critical tests aren’t run, standard treatments aren’t given. Even when the right answer is known by someone – and often by everyone involved – patients slip through the cracks.
Fundamentally, medicine knows too much; even decades of medical training are insufficient for a doctor to know everything. A hospitalized patient’s treatment involves coordination between dozens of different specialized professionals and departments. The hospital environment itself is chaotic, time-pressured, and filled with interruptions and distractions: far from ideal for human workers making high-stakes decisions. Patients are subjected to many interventions, most of which are complex and carry some risk; the average ICU patient requires roughly 178 daily care tasks (having worked as an ICU nurse myself, I believe it!), so even getting it perfect 99% of the time leaves an average of about two medical errors per day.
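Just to make that arithmetic concrete, here’s a quick back-of-the-envelope calculation in Python (the 178-task figure is from the book; the simplifying assumption that tasks fail independently is mine):

```python
# Back-of-the-envelope arithmetic for the ICU numbers above.
# Assumes (simplistically) that each task fails independently.
tasks_per_day = 178      # average daily care tasks per ICU patient (from the book)
per_task_success = 0.99  # hypothetical 99% per-task reliability

expected_errors_per_day = tasks_per_day * (1 - per_task_success)
chance_of_error_free_day = per_task_success ** tasks_per_day

print(f"Expected errors per day: {expected_errors_per_day:.2f}")       # ~1.78
print(f"Chance of an error-free day: {chance_of_error_free_day:.1%}")  # ~16.7%
```

In other words, at 99% per-task reliability, a patient has less than a one-in-five chance of getting through a single day with no errors at all.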
Medical professionals know how to perform all 178 of those tasks; they’ve probably done them hundreds if not thousands of times. The failure is one of reliability and diligence – skills for which Atul Gawande has a deep appreciation. In another book, Better, he says:
The first [virtue] is diligence, the necessity of giving sufficient attention to detail to avoid error and prevail against obstacles. Diligence seems an easy and minor virtue. (You just pay attention, right?) But it is neither. Diligence is both central to performance and fiendishly hard.
As Gawande notes, these failures of diligence are far from unique to healthcare. He spends much of the book describing his investigations of other fields and conversations with their various experts. (An adorable Atul Gawande trait is how he’s the sort of person who will befriend the construction crew working on a new wing for the hospital where he works, get himself invited to their offices, and spend multiple pages of his book enthusiastically describing their project management systems.)
Other professions face the same basic problem: the knowledge base and the complexity of the work grow until no single expert can fit all the relevant pieces into their head. The attentional load grows, and getting it right 99% of the time isn’t good enough. Mistakes are made, details go unnoticed, corners are cut by rushed and overworked staff, and (in the medical field at least) people die.
Fortunately for Gawande’s medical practice, he found that other industries had already explored and thoroughly tested some solutions. The basic problem of human reliability in complex situations is one that the airline industry had already discovered in the early 20th century. The US Army Air Corps was testing new bomber aircraft designs, and one of these was Boeing’s Model 299. It was a miracle of engineering; it could hold five times as many bombs as the specs had requested, and flew faster and further than any previous plane.
But during its first test flight, on October 30, 1935, the plane crashed. Technically, the equipment functioned perfectly. But the controls were so numerous and complicated that human error was almost inevitable. The pilot, overwhelmed, forgot to release a new locking mechanism on the elevator and rudder controls. As a newspaper at the time wrote, it was “too much airplane for one man to fly.”
The US Army Air Corps chose a different design, sacrificing performance for simplicity.
They didn’t give up, though; they ordered a handful of Model 299s and handed them over to a team of test pilots, who put their heads together and tried to find a way for pilots to safely fly a plane that was too challenging for even the most highly trained, expert human brains to handle.
They wrote a checklist. It was an easy list of tasks, ones that all pilots already knew to do – checking instruments, releasing brakes, closing doors and windows, unlocking elevator controls. And yet, however obvious, it made all the difference; the test pilots went on to fly a total of 1.8 million miles with no accidents, and the Army ordered thousands of the aircraft, later called the B-17. One index-card-sized checklist ended up giving the US Army a decisive advantage in WWII.
A checklist does not need to be long to be useful. One of the first checklists introduced in hospitals, aimed at decreasing central line infections in ICU patients, was trialled at Johns Hopkins Hospital in Baltimore in 2001. It had five steps. The doctor was supposed to:
1. wash their hands with soap
2. clean the patient’s skin with chlorhexidine antiseptic
3. put sterile drapes over the entire patient
4. wear a mask, hat, sterile gown, and gloves
5. place a sterile dressing over the insertion site once the line was in
All of these were steps that doctors were already supposed to be taking, and not even hard steps. (I know that I rolled my eyes at this list when I was introduced to it.) But hospital staff are busy, stressed, and sleep-deprived – and perfect 100% reliability is nearly impossible for humans even under ideal conditions. Part of the change being introduced was a social one: nurses were responsible for documenting that the doctor had carried out each step, and had a new mandate – and backup from management and hospital administration – to chide doctors who forgot items.
Which, it turned out, made all the difference. In the first ten days of the experiment, the line infection rate went from 11% to zero. Over the next fifteen months, there were two (2) infections total. Compared to projections based on previous rates, the simple protocol prevented 43 infections and eight deaths – not to mention saving the hospital millions of dollars.
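A programmer’s aside (mine, not Gawande’s): the mechanism here is essentially a forcing function, and a minimal sketch of the idea might look like the following Python. All names are hypothetical, invented for illustration:

```python
# A toy model (my illustration, not from the book) of a checklist as a
# forcing function: every step must be explicitly confirmed by a second
# observer before the procedure may continue.

CENTRAL_LINE_CHECKLIST = [
    "wash hands with soap",
    "clean the patient's skin with chlorhexidine antiseptic",
    "put sterile drapes over the entire patient",
    "wear a mask, hat, sterile gown, and gloves",
    "place a sterile dressing over the insertion site",
]

def run_checklist(steps, confirm):
    """Walk through each step, halting if any is left unconfirmed.

    `confirm` plays the nurse's role: an independent observer with the
    authority to stop the procedure, not just a reminder.
    """
    for step in steps:
        if not confirm(step):
            raise RuntimeError(f"Procedure halted: unconfirmed step: {step!r}")

# In real life the observer is a human; here we simply approve every step.
run_checklist(CENTRAL_LINE_CHECKLIST, confirm=lambda step: True)
```

The design point isn’t the data structure; it’s that verification is separated from execution, which is exactly the social change the Johns Hopkins trial introduced.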
And yet, even after decades, checklist-style interventions are not universal. Healthcare is still far less systematized than aviation (with its booklet of procedures for every single kind of emergency) or construction (with its elaborate project management systems and clear lines of communication for any front-line worker to report concerns to the engineers). As Atul Gawande puts it:
We in medicine continue to exist in a system created in the Master Builder era – a system in which a lone Master Physician with a prescription pad, an operating room, and a few people to follow his lead plans and executes the entirety of care for a patient, from diagnosis through treatment. We’ve been slow to adapt to the reality that, for example, a third of patients have at least ten specialist physicians actively involved in their care by their last year of life, and probably a score more personnel, ranging from nurse practitioners and physician assistants to pharmacists and home medical aides. And the evidence of how slow we’ve been to adapt is the extraordinarily high rate at which care for patients is duplicated or flawed or completely uncoordinated.
From early on, the data looked conclusive; checklists in a hospital setting saved lives. But over and over, Atul Gawande mentions the difficulties he and others faced in getting buy-in from medical staff to adopt new checklists. They were too time-consuming. The items were confusing or ambiguous. The staff rolled their eyes at how stupidly obvious the checklist items were; whatever the data showed, it just didn’t feel like they ought to be necessary.
Making a good human-usable checklist takes a lot of workshopping. Airlines are still constantly revising their 200-page manual of individually optimized checklists for every possible emergency, as plane designs change and new safety data rolls in. (Amusing fact: the six-item checklist for responding to engine failure while flying a single-engine Cessna plane starts with “FLY THE AIRPLANE”.) Gawande and his team spent months refining their surgical safety checklist before they had something usable, and even now, it’s not universally adopted; implementing the list in new hospitals, especially in the developing world, means adjusting it for existing local protocols and habits, available resources, and cultural factors.
But even in the poorest hospitals, using it saves lives. And there’s a broader lesson to be learned here. In any complex field – which encompasses quite a lot of the modern world – even very obvious, straightforward instructions to check off for routine tasks can cut down on the cognitive overhead and reduce “careless” human error, making near-perfect performance much more feasible.