Compression is closely related to forecasting—which in turn is a large part of intelligence (my estimate is 80% forecasting vs 20% pruning and evaluation). So your thesis bears on the idea of an intelligence explosion.
Well, in an evolutionary contest, creatures able to compress their sense data to 0.22 of its original size might drive to extinction creatures that can only manage a ratio of 0.23. ‘Small’ differences in modeling ability, as measured by compression ratios, could thus have large effects on the world.
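To see why a 0.01 difference in ratio is not really ‘small’, recall the compression/prediction correspondence invoked above: a code of length L bits corresponds to a model that assigns the data probability 2^-L. A minimal sketch, assuming the 0.22 and 0.23 figures are achieved on the same n-bit stream:

```python
# Sketch only: the 0.22/0.23 ratios are the hypothetical figures from the
# comment above, not measurements. Under the code-length/probability
# correspondence (L bits of code <=> probability 2**-L), a compressor that
# achieves ratio r on an n-bit stream implicitly assigns it probability
# 2**(-r * n).

def log2_likelihood_ratio(ratio_a: float, ratio_b: float, n_bits: int) -> float:
    """Log2 of how many times more probable the better compressor's
    implicit model finds the data: (ratio_b - ratio_a) * n_bits."""
    return (ratio_b - ratio_a) * n_bits

for n in (10**3, 10**6, 10**9):
    bits = log2_likelihood_ratio(0.22, 0.23, n)
    print(f"n = {n:>10} bits: better model is 2^{bits:.0f}x more probable")
```

On any realistic amount of sense data, the 0.22 compressor's implicit model is astronomically better calibrated, so the edge compounds rather than staying ‘small’.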
However, compression (and AI) are hard problems and run rapidly into diminishing returns, at least if you measure them in this way.
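A minimal sketch of what those diminishing returns look like in practice, using only Python's standard-library compressors on a toy corpus (the corpus and the compressor lineup are illustrative assumptions, not measurements from this exchange):

```python
import bz2
import lzma
import zlib

# Toy corpus: a sentence of English repeated many times. Real measurements
# would use a large natural corpus; the ratios here are only illustrative.
corpus = (
    b"Creatures able to compress their sense data to 0.22 of its original "
    b"size might drive to extinction creatures that can only manage 0.23. "
) * 2000

# Roughly ordered from fast/weak to slow/strong.
compressors = {
    "zlib level 1": lambda data: zlib.compress(data, 1),
    "zlib level 9": lambda data: zlib.compress(data, 9),
    "bz2": bz2.compress,
    "lzma": lzma.compress,
}

for name, compress in compressors.items():
    ratio = len(compress(corpus)) / len(corpus)
    print(f"{name:>12}: ratio {ratio:.4f}")
```

Each step up in sophistication (and compute) tends to buy a smaller absolute improvement in ratio: the first standard tricks capture most of the exploitable redundancy, and further gains get progressively more expensive.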
I guess that's only very weak negative evidence against an intelligence explosion (like the fact that NP-complete problems exist, and lots of AI problems are NP-complete).
The steelman of a pro-explosion argument is probably very sophisticated and easily avoids these issues.
That evolutionary-contest scenario sounds pretty speculative.