If you let a program store one more binary bit of information, it will be able to cut down a space of possibilities by half, and hence assign twice as much probability to all the points in the remaining space. This suggests that one bit of program complexity should cost at least a “factor of two gain” in the fit. If you try to design a computer program that explicitly stores an outcome like “HTTHHT”, the six bits that you lose in complexity must destroy all plausibility gained by a 64-fold improvement in fit. Otherwise, you will sooner or later decide that all fair coins are fixed.
I found this paragraph confusing. How about:
If you let a program store one more binary bit of information, it will be able to cut down a space of possibilities by half, and hence assign twice as much probability to all the points in the remaining space. This suggests that one bit of program complexity should always buy at least a “factor of two gain” in the fit. If you try to design a computer program that explicitly stores an outcome like “HTTHHT”, the six bits that you pay in complexity must get you at least a 64-fold improvement in fit. Otherwise, you will sooner or later decide that all fair coins are fixed.
Does that mean the same thing?
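To make the arithmetic that both versions rely on concrete, here is a minimal Python sketch. It assumes a complexity prior that charges a factor of two per stored bit; the function `posterior_weight` and the two toy hypotheses are illustrative names, not anything from the draft:

```python
# A minimal sketch of the bits-vs-fit tradeoff described above.
# Assumption (not from the original text): a complexity prior of 2^-k
# for k extra bits of program length, multiplied by the likelihood the
# hypothesis assigns to the observed data.

def posterior_weight(extra_bits: int, likelihood: float) -> float:
    """Prior penalty 2^-extra_bits times how well the hypothesis fits the data."""
    return 2.0 ** -extra_bits * likelihood

outcome_length = 6  # the six flips in "HTTHHT"

# Fair-coin hypothesis: stores no extra bits, assigns 2^-6 to any 6-flip sequence.
fair = posterior_weight(extra_bits=0, likelihood=2.0 ** -outcome_length)

# Memorizing hypothesis: stores the outcome verbatim (6 extra bits),
# then predicts it with probability 1 -- a 64-fold improvement in fit.
memorizer = posterior_weight(extra_bits=outcome_length, likelihood=1.0)

print(fair, memorizer)    # 0.015625 0.015625
print(fair == memorizer)  # True: the 6 bits of complexity exactly cancel the
                          # 64-fold fit gain, so you never end up concluding
                          # that a fair coin is fixed.
```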