Yikes, I see why—I worded the concept quite poorly. The example I was trying to describe is in software engineering, where you have an ancient crufty mess that you’re trying to rewrite in some snazzy new language. You think you can rewrite it and make it super simple, and so you write the new thing the simple way that “should work”, but when you run the old code’s tests against it (or when you put it to use in the real world...) you discover that the reason the old code was such a mess was partly that it had a bunch of logic to handle various edge cases that the application had hit in the past.
An alternative phrasing might be: “Where are the gaps between how I think the status quo ‘should’ work and how it actually does?” Often, established systems are silently compensating for all kinds of problems that happen infrequently enough that, by the time anyone tries to replace the system, they’ve forgotten those problems exist.
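A minimal, contrived sketch of that pattern (all names and edge cases here are invented for illustration, not taken from any real codebase): the “crufty” original quietly absorbs inputs the application has actually seen, and the clean rewrite only looks correct until the old tests run against it.

```python
def parse_quantity_legacy(raw: str) -> int:
    """Years of accumulated fixes for inputs the app has actually seen."""
    if raw is None or raw.strip() == "":
        return 0                        # blank fields from an old CSV importer
    raw = raw.strip().replace(",", "")  # "1,200" from a locale bug long ago
    if raw.endswith("pcs"):
        raw = raw[:-3]                  # a vendor feed that appends units
    return int(float(raw))              # "3.0" from a partner's JSON encoder

def parse_quantity_rewrite(raw: str) -> int:
    """The 'obviously correct' one-liner in the shiny new service."""
    return int(raw)

# Running the old suite against the rewrite is where the surprise shows up.
legacy_cases = [("", 0), ("1,200", 1200), ("12pcs", 12), ("3.0", 3)]
for raw, expected in legacy_cases:
    assert parse_quantity_legacy(raw) == expected
    try:
        assert parse_quantity_rewrite(raw) == expected
    except (ValueError, AssertionError):
        print(f"rewrite fails on {raw!r}: the mess was load-bearing")
```

None of those special cases are visible from the function’s signature or its “intended” behavior; they only exist because the system hit those inputs at some point, which is exactly the gap between “should work” and “actually does.”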