I think what he’s saying is that the existence of noise in computing hardware means that any computation done on this hardware must be (essentially) invariant to that noise, which pushes the mathematical methods away from the precise, all-or-nothing logic of discrete math and toward the fuzzier, smoother logic of probability distributions and the real line. This makes me think of analog computing, which is often done in environments with high noise and can indeed produce computations that are mostly invariant to it.
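To make the contrast concrete, here is a small illustrative sketch of my own (nothing von Neumann proposed, and the noise scale and thresholds are purely hypothetical): a single noisy comparison near a decision boundary is brittle, while averaging many noisy readings before deciding is largely invariant to the same noise.

```python
# Illustrative sketch (assumptions: Gaussian hardware noise, a 0.5 decision
# threshold, and an input near that threshold). Compares a brittle
# all-or-nothing computation with a noise-tolerant, averaging-based one.
import random

NOISE = 0.05  # assumed noise scale; purely hypothetical


def noisy(x: float) -> float:
    """Simulate an unreliable component: every reading picks up Gaussian noise."""
    return x + random.gauss(0.0, NOISE)


def brittle_threshold(x: float) -> int:
    """All-or-nothing logic: a single noisy comparison decides the answer."""
    return 1 if noisy(x) >= 0.5 else 0


def averaged_threshold(x: float, repeats: int = 101) -> int:
    """Noise-tolerant version: average many noisy readings before deciding,
    so the result depends on a smooth quantity rather than one sample."""
    mean = sum(noisy(x) for _ in range(repeats)) / repeats
    return 1 if mean >= 0.5 else 0


if __name__ == "__main__":
    x = 0.52  # true answer is 1; the input sits close to the boundary, where noise matters most
    brittle = [brittle_threshold(x) for _ in range(1000)]
    robust = [averaged_threshold(x) for _ in range(1000)]
    print("brittle agreement with true answer:", sum(brittle) / len(brittle))
    print("averaged agreement with true answer:", sum(robust) / len(robust))
```

Running this, the brittle version gets the answer right only about two-thirds of the time, while the averaged version is essentially always correct, which is the kind of noise-invariance I read the quote as pointing at.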
But, of course, analog computing is a niche field dwarfed by digital computing, making this prediction of von Neumann’s comically wrong: the solution people went with wasn’t to re-imagine all computations in a noise-invariant way; it was to improve the hardware to the point that the noise became negligible. But I don’t want to sound harsh here at all. The prediction was so wrong only because, at least as far as I know, there was no reasonable way in the ’50s to predict the effect transistors would have on computing, since they had only just been invented. From that vantage point, it seems reasonable to expect creative improvements in mathematical methods before a several-orders-of-magnitude improvement in hardware accuracy.