I still don’t think the programming example supports your point.
For example, in C and C++, signed integer overflow is undefined behavior (unsigned overflow wraps, but signed overflow does not). The compiler is allowed to break your program if it happens. Undefined behavior is useful for optimizations—for example, the compiler can fold x < x + 1 to true, which helps eliminate branches—and there have been popular programs that quietly broke when a new compiler release got better at such optimizations. John Regehr’s blog is a great source on this.
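Here’s a minimal sketch of that kind of optimization. The function and its behavior at INT_MAX are my own illustration; what you actually observe depends on the compiler and flags:

```c
#include <limits.h>
#include <stdio.h>

/* Because signed overflow is undefined behavior, an optimizing
 * compiler may assume x + 1 never overflows and fold this
 * comparison to a constant 1 (true). */
int always_less(int x) {
    return x < x + 1;
}

int main(void) {
    /* At INT_MAX, x + 1 overflows. The "wraparound" answer would be 0,
     * but a compiler exploiting undefined behavior is free to return 1. */
    printf("%d\n", always_less(INT_MAX));
    return 0;
}
```

Compile it with and without optimization (e.g. gcc -O0 vs. -O2) and you may get different answers for the same input—exactly the kind of quiet breakage I mean.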
Almost nothing in programming is 100% reliable; most things just kinda seem to work. Maybe it would be better to use an example from math.