Clearly, as complexity goes to infinity, the probability goes to 0. That is, for any epsilon > 0 there exists an n such that every hypothesis with complexity > n has probability less than epsilon. (Otherwise, for every n some hypothesis of complexity > n would have probability at least epsilon, so infinitely many hypotheses would have probability at least epsilon; taking more than 1/epsilon of them would produce a collection of hypotheses with total probability greater than 1.)
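Here is a minimal sketch of that pigeonhole step, assuming a countable hypothesis space with probabilities summing to at most 1, and writing K(h) for the complexity (say, message length) of h; the notation is mine, not the post's:

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
Fix $\varepsilon > 0$ and let $S_\varepsilon = \{\, h : P(h) \ge \varepsilon \,\}$.
If $|S_\varepsilon| > 1/\varepsilon$, then
\[
  \sum_{h \in S_\varepsilon} P(h) \;>\; \frac{1}{\varepsilon} \cdot \varepsilon \;=\; 1,
\]
contradicting $\sum_h P(h) \le 1$. So $S_\varepsilon$ is finite (if empty,
any $n$ works), and taking $n = \max_{h \in S_\varepsilon} K(h)$ gives
$P(h) < \varepsilon$ whenever $K(h) > n$.
\end{document}
```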
So, as the post says, “on average” Occam’s razor must be true. (As must any other razor using a different measure of complexity, or flubbity, or whatever. However, message length seems to me to be the only “nice” measure to use on hypotheses.) But it doesn’t say that hypotheses have to be ranked by probability in order of complexity. Of course, it would be elegant if they were.
Why should we favour the elegant explanation?
Occam’s razor, of course!
Well, taking this to be true, it seems we could even construct an ordering that enumerates algorithms by something that maps to inverse complexity, and the same argument would then prove the opposite of Occam’s razor.
But I can’t think of any such ordering, and I’ve got a proof kicking around in my head that there can be no such ordering, so this hunch is probably wrong.
For any given value of inverse complexity, infinitely many algorithms have a lower one (i.e., are more complex). That seems to break any attempt to enumerate algorithms in that order, or to assign them decreasing nonzero probabilities that sum to 1.
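A sketch of that impossibility, under the assumption (true for message length) that only finitely many algorithms exist at each complexity level, so complexity is unbounded:

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
Suppose $a_1, a_2, \dots$ enumerated every algorithm in order of
non-increasing complexity, $K(a_1) \ge K(a_2) \ge \cdots$. Since only
finitely many algorithms have each complexity value, $K$ is unbounded,
so some algorithm $b$ satisfies $K(b) > K(a_1)$. But $b = a_i$ for some
$i$, giving $K(a_i) > K(a_1) \ge K(a_i)$, a contradiction. Similarly,
any assignment with $P(h) \ge p > 0$ for every $h$ above some complexity
level would make $\sum_h P(h)$ diverge, so it cannot sum to $1$.
\end{document}
```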
Indeed there is no such ordering.