Good point. My answer is: yes, we have to accept a speed/accuracy tradeoff. That doesn’t seem like such a disaster in practice.
Some people, most notably Matt Mahoney, have actually organized data compression contests similar to what I’m advocating. Mahoney’s solution is simply to impose a time limit that is reasonable but arbitrary. In the future, researchers could develop a spectrum of theories, each of which occupies a non-dominated position on a speed/compression curve: no other theory is both faster and compresses better. Unless something Very Strange happened, each faster/less accurate theory would be related to its slower/more accurate cousin by a standard suite of approximations. (It would be strange, but interesting, if you could get an accurate and fast theory by doing a nonstandard approximation or introducing some kind of new concept.)
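To make the "non-dominated position" idea concrete, here is a minimal sketch in Python of how one might pick out the speed/compression frontier from a set of benchmark entries. The entry names and numbers are hypothetical placeholders, not results from any real contest.

```python
# Sketch: find the non-dominated (Pareto-optimal) entries on a
# speed/compression curve. Both axes are "lower is better":
# runtime in seconds and compressed size in bytes.
# All names and figures below are made up for illustration.

from typing import NamedTuple


class Entry(NamedTuple):
    name: str
    seconds: float          # runtime to compress the corpus
    compressed_bytes: int   # resulting compressed size


def pareto_frontier(entries: list[Entry]) -> list[Entry]:
    """Return the entries not dominated by any other entry.

    Entry a dominates entry b if a is at least as fast AND at least as
    small, and strictly better on one of the two axes.
    """
    frontier = []
    for e in entries:
        dominated = any(
            o.seconds <= e.seconds
            and o.compressed_bytes <= e.compressed_bytes
            and (o.seconds < e.seconds or o.compressed_bytes < e.compressed_bytes)
            for o in entries
        )
        if not dominated:
            frontier.append(e)
    return sorted(frontier, key=lambda e: e.seconds)


if __name__ == "__main__":
    candidates = [
        Entry("fast_approximation", 120.0, 230_000_000),
        Entry("midrange_model",     900.0, 195_000_000),
        Entry("slow_full_theory",  7200.0, 170_000_000),
        Entry("slow_and_sloppy",   5000.0, 240_000_000),  # dominated: slower AND bigger than midrange_model
    ]
    for e in pareto_frontier(candidates):
        print(f"{e.name}: {e.seconds}s, {e.compressed_bytes} bytes")
```

In this picture, Mahoney's fixed time limit amounts to drawing a vertical line on the curve and ranking only what falls to its left; the frontier view keeps the whole tradeoff visible instead.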