The Logic of Failure, by Dietrich Doerner.

Summary:
Doerner draws on historical and laboratory data to illustrate generic features of many/most (I’d say “all”) failures of judgment in complex situations. He offers suggestions on how to overcome our predisposition to failure in these situations.
Lessons:
This book is a treasure trove. It ranges broadly: complexity, goals, models, time, ignorance, planning, and more. I can’t emphasize enough how illuminating (and readable!) this book is.
Here are some quotes from my notes on the first half of the book:
- “When we fail to solve a problem, we fail because we tend to make a small mistake here, a small mistake there, and these mistakes add up.” (p. 7)
- “...it is far from clear whether ‘good intentions plus stupidity’ versus ‘evil intentions plus intelligence’ have wrought more harm in the world.” (p. 8)
- “If, the moment something goes wrong, we no longer hold ourselves responsible but push the blame onto others, we guarantee that we remain ignorant of the real reasons for poor decisions, namely inadequate plans and failure to anticipate the consequences.” (p. 27)
- “We find an inability to think in terms of nonlinear networks of causation rather than chains of causation—an inability, that is, to properly assess the side effects and repercussions of one’s behavior. We find an inadequate understanding of exponential development, an inability to see that a process that develops exponentially will, once it has begun, race to its conclusion with incredible speed. These are all mistakes of cognition.” (p. 33; see the numeric sketch after this list)
- “[Characteristics of analysis of complicated systems:] complexity, intransparence, internal dynamics” (p. 37)
- “...the ability to make allowances for incomplete and incorrect information and hypotheses is an important requirement for dealing with complex situations.” (p. 42)
- “Formless collections of data about random aspects of a situation merely add to the situation’s impenetrability and are no aid to decision making.” (p. 44)
- “...goals may be: positive or negative, general or specific, clear or unclear, simple or multiple, implicit or explicit” (p. 52)
- “By labeling a bundle of problems with a single conceptual label, we make dealing with that problem easier—provided we’re not interested in solving it. Phrases like ‘urgently needed measures for combating unemployment’ roll easily off the tongue if we don’t have to do anything about unemployment.” (p. 55)
- “There are many ways of tackling multiple problems at once, but the one thing we usually cannot do is solve all the problems at once.” (p. 55)
- “Goals conflict with one another not by their very nature but because the variables relating to them in the system are negatively linked.” (p. 57; see the second sketch after this list)
- “[The problems of DDT (e.g.)] So the mistake is less not knowing than not wanting to know. And not wanting to know is a result not of ill will or egoism but of thinking that focuses on an immediately acute problem.” (p. 58)
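Doerner’s point about exponential development (p. 33) is easy to feel numerically. Below is a minimal Python sketch (my own toy illustration, not an example from the book) of the classic doubling pond: a lily patch that doubles daily still looks negligible at 1% coverage on day 24, yet covers the whole pond only six days later.

```python
# Toy model of exponential development: a lily patch doubling daily.
# Starting from a one-billionth fraction of the pond, coverage passes
# 1% on day 24 and reaches 100% on day 30 -- the process "races to its
# conclusion" in the final few steps.
coverage = 1e-9  # initial covered fraction (arbitrary assumption)

for day in range(1, 61):
    coverage = min(1.0, coverage * 2)  # double daily, capped at full pond
    if 0.01 <= coverage and coverage / 2 < 0.01:
        print(f"day {day}: 1% covered -- pond still looks empty")
    if 0.5 <= coverage and coverage / 2 < 0.5:
        print(f"day {day}: half covered")
    if coverage >= 1.0:
        print(f"day {day}: fully covered")
        break
```

For most of the run nothing visible happens, which is exactly why, as Doerner argues, people underestimate how quickly the end arrives.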
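The p. 57 remark about negatively linked variables can likewise be made concrete. Here is a second small sketch (again my own construction with made-up numbers, not Doerner’s) in which two goals depend on one control variable with opposite signs, so no setting improves both at once:

```python
# Two goals coupled through one control variable, with opposite signs.
# The goal conflict lives in the negative linkage, not in the goals.
def employment(factory_activity: float) -> float:
    return 10.0 * factory_activity           # rises with activity

def air_quality(factory_activity: float) -> float:
    return 100.0 - 8.0 * factory_activity    # falls with activity

for activity in (2.0, 5.0, 8.0):
    print(f"activity {activity:.0f}: "
          f"employment {employment(activity):.0f}, "
          f"air quality {air_quality(activity):.0f}")
# Raising activity always trades air quality for employment; the two
# goals conflict only because the system links them negatively.
```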