Can you expand on this? How has this concept helped you understand things?
It’s not so much a way to understand the world as it is a deep engineering principle. You want to design systems in such a way that if there is a problem, the system fails sooner rather than later. One very counter-intuitive technique used in software engineering is the assertion, which has no effect other than to shut the system down if a certain expression evaluates to false. Naively, it seems crazy that we would ever want to add to the set of conditions that will cause a crash—but that is exactly the point.
Fail-fast is part of the more general strategy expressed by the slogan tighten your feedback loops. If you are doing something stupid, you want to know as soon as possible. This is hard for humans to do, since we have a natural aversion both to bad news and to discipline.