I have been thinking about the same post about Occam's razor, and I am stuck on a question: “what is the medium complexity of the true hypothesis across the whole field of hypotheses?” I hope the question is clear without a longer explanation of what I mean, but I will try to explain it a little anyway.
Occam's razor doesn't say that the simplest hypothesis is true. It just says that the probability of truth diminishes as the complexity of the hypothesis grows. It is clear that most of the time the true hypothesis will lie somewhere after the point where p(N1) + p(N2) + ... + p(Nn) = 0.5, where n is the number of hypotheses ranked by their complexity, and p(N) is the probability that a given hypothesis is true according to the Occam's razor principle.
I also have the feeling that in EY's writing it is always assumed that the Occam's razor weighting falls off very quickly, so that the simplest hypothesis has an overwhelming probability of being true. However, I also have the feeling that in real life medium-complexity hypotheses dominate, something like the 100th hypothesis from the beginning. That would make for a much more complex and unpredictable world.
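To make that “near the 100th” intuition concrete, here is a small Python sketch (purely illustrative; the slowly decaying 1/n prior and the truncation at 100,000 hypotheses are my own assumptions, not anything from the post) that finds the index at which the cumulative probability first reaches 0.5:

```python
# Illustrative sketch (assumptions: a 1/n prior over complexity rank, truncated
# at 100,000 hypotheses): find the "medium" (median) hypothesis, i.e. the
# smallest n such that p(N1) + ... + p(Nn) >= 0.5.

def median_index(weights):
    """Return the 1-based index where the cumulative probability first reaches 0.5."""
    total = sum(weights)
    cumulative = 0.0
    for i, w in enumerate(weights, start=1):
        cumulative += w / total
        if cumulative >= 0.5:
            return i
    return len(weights)

slow_prior = [1.0 / n for n in range(1, 100_001)]
print(median_index(slow_prior))  # an index in the low hundreds, roughly the "near 100" regime
```

With a prior that falls off this slowly, half of the probability mass is spread over hundreds of hypotheses, which is the picture I have in mind.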
It looks like you have been thinking in a similar direction. Do you have any ideas about the medium complexity of true hypotheses?
“Simple” in Occam's razor means “simplest that explains the facts at all”. So you delete a bunch of hypotheses that are too simple to be explanatorily adequate, and then you delete the ones that are unnecessarily complex. That gives you some sort of medium complexity.
My question was more about the medium length of the algorithm predicted by Solomonoff induction.
Update: according to https://wiki.lesswrong.com/wiki/Solomonoff_induction, the weight of a hypothesis diminishes very quickly, like 2^(-n), where n is the program length. In that case, the medium level will be somewhere between the first and the second hypothesis.
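A quick numerical check of that, using the same cumulative-weight idea as the sketch above (truncating at 50 programs is just an assumption to keep the sketch finite; the tail beyond that is negligible):

```python
# Quick check: with weights proportional to 2**(-n), where does the cumulative
# probability first reach 0.5? (Truncation at n = 50 is an assumption made only
# to keep the sketch finite; the remaining tail is negligible.)
weights = [2.0 ** (-n) for n in range(1, 51)]
total = sum(weights)
cumulative = 0.0
for n, w in enumerate(weights, start=1):
    cumulative += w / total
    if cumulative >= 0.5:
        print(n)  # the shortest program alone carries about half of the total weight
        break
```

So under the 2^(-n) weighting the halfway point is reached essentially at the very first hypothesis, which matches the “between the first and second” conclusion.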
Are you using medium to mean median?
Yes, by mistake :(