???
From the url:
Create a compressed version (self-extracting archive) of the 100MB file enwik8 of less than about 16MB. More precisely:
[etc]
There are restrictions on run speed, but they seem to be there to prevent brute-forcing all possible approaches. There is also talk of “compressing intelligently.” I am not even sure what a pure speed test of compression would mean; the identity function is pretty fast!
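To make the size/speed tradeoff concrete, here is a rough sketch that runs a few of Python’s standard-library codecs over enwik8 and times them. This is only an illustration, not the prize’s actual test harness; the file path and the choice of codecs are assumptions, and the numbers will vary by machine.

```python
# Rough baseline, not the prize harness: compress enwik8 with a few
# standard-library codecs and time them, to see where off-the-shelf
# tools land relative to the ~16MB target.
# Assumes enwik8 (100,000,000 bytes) sits in the working directory.
import bz2
import lzma
import time
import zlib

with open("enwik8", "rb") as f:
    data = f.read()

codecs = {
    "zlib (level 9)": lambda d: zlib.compress(d, 9),
    "bz2  (level 9)": lambda d: bz2.compress(d, 9),
    "lzma (default)": lambda d: lzma.compress(d),
}

print(f"original: {len(data):,} bytes")
for name, compress in codecs.items():
    start = time.perf_counter()
    out = compress(data)
    elapsed = time.perf_counter() - start
    print(f"{name:15s} {len(out):>12,} bytes  in {elapsed:6.1f} s")
```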
There are linear-time implementations of compression schemes that reach the Shannon bound, so any runtime improvement to an asymptotically optimal encoder will not be dramatic either...
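To spell out what “the Shannon bound” means in the simplest case: if you model the file as i.i.d. bytes, the bound is the order-0 entropy, and an arithmetic coder gets essentially there in linear time. Here is a quick sketch of that bound; the enwik8 path is again an assumption, and real compressors do far better than order-0 by modeling context.

```python
# Order-0 Shannon bound: treat the file as i.i.d. bytes with empirical
# distribution p; H = -sum p_i * log2(p_i) bits per byte is the best any
# memoryless byte-level code can approach (arithmetic coding gets close,
# and runs in linear time). Context modeling can go well below this.
import math
from collections import Counter

def order0_entropy_bits_per_byte(data: bytes) -> float:
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

with open("enwik8", "rb") as f:  # assumed to be in the working directory
    data = f.read()

h = order0_entropy_bits_per_byte(data)
print(f"order-0 entropy: {h:.3f} bits/byte")
print(f"memoryless lower bound: {len(data) * h / 8 / 1e6:.1f} MB")
```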
This is my point: the algorithmic challenge is to do a given compression task quickly, and Shannon puts no limit I’m aware of on how fast that can be done. So there’s no problem of approaching physical limits the way there would be for, say, “Encode this file in the smallest space possible”.
Edit: The stuff below the line either wasn’t there when I replied or I didn’t see it. You’re right, though: if linear-time implementations already exist, then you’re basically hard-bounded.
To be sure, constant factors matter a lot! But I think in this case the curve would be pretty boring.