Doesn’t it take more bits to specify NOT P than to specify P? I mean, I can take any proposition and add “..., and I like pudding”, but this doesn’t mean that half of all n-bit propositions are about me liking pudding.
Doesn’t it take more bits to specify NOT P than to specify P?
No. If “NOT P” took more bits to specify than “P”, this would also mean that “NOT NOT P” would take more bits to specify than “NOT P”. But NOT NOT P is identical to P, so it would mean that P takes more bits to specify than itself.
With actual propositions now, instead of letters:
If you have the proposition “The Moon is Earth’s satellite”, and the proposition “The Moon isn’t Earth’s satellite”, each is the negation of the other. If a proposition’s negation takes more bits to specify than the proposition, then you’re saying that each statement takes more bits to specify than the other.
Even simpler—can you think of any reason why it would necessarily take more bits to codify “x != 5” than “x == 5”?
We’re talking about minimum message length, and the minimum message of NOT NOT P is simply P.
Once you consider double negation, I don’t have any problem with saying that
“the Moon is Earth’s satellite”
is a simpler proposition than
“The following statement is false: the Moon is Earth’s satellite”
The abstract syntax tree for “x != 5” is bigger than the AST of “x == 5”. One of them uses numeric equality only; the other uses numeric equality and negation. I expect, though I haven’t verified, that the earliest, simplest compilers generated more processor instructions to compute “x != 5” than they did to compute “x == 5”.
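For instance, here’s a quick sketch in Python (my own illustration, treating “!=” in its desugared form as a negated “==”):

```python
import ast

def node_count(src):
    """Count the AST nodes in a parsed expression."""
    return sum(1 for _ in ast.walk(ast.parse(src)))

eq = node_count("x == 5")
neq_desugared = node_count("not (x == 5)")  # '!=' written as negation of '=='

# The negated form carries extra nodes for the negation operator.
assert neq_desugared > eq
```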
Aris is right. NOT is just an operator that flips a bit. Take a single bit: 1. Now apply NOT. You get 0. Or you could have a bit that is 0. Now apply NOT. You get 1. Same number of bits. Truth tables for A and ~A are the same size.
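In code (a trivial sketch, but it makes the point):

```python
P = [0, 1]                   # every possible truth value of P: one bit
NOT_P = [1 - p for p in P]   # NOT just flips that bit

# The truth tables for P and NOT P have the same number of rows,
# and each entry is still a single bit.
assert len(NOT_P) == len(P)
assert NOT_P == [1, 0]
```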
We’re talking about minimum message length, and the minimum message of NOT NOT P is simply P.
That’s what I said. But you also said that NOT P takes more bits to specify than P. You can’t have it both ways.
You don’t understand this point. If I’ve already communicated P to you—do you need any further bits of info to calculate NOT P? No: Once you know P, NOT P is also perfectly well defined, which means that NOT P by necessity has the SAME message length as P.
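To put the same point in code (a toy sketch, names mine): once a one-bit message for P has arrived, the receiver derives NOT P locally, with no further transmission.

```python
def receive(message):
    """Decode P from a one-bit message; NOT P costs zero additional bits."""
    (p,) = message       # everything transmitted: just P
    not_p = 1 - p        # computed on the receiver's side, never sent
    return p, not_p

p, not_p = receive([1])
assert (p, not_p) == (1, 0)
```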
The abstract syntax tree for “x != 5” is bigger than the AST of “x == 5″.
You aren’t talking about minimum message length anymore, you’re talking about human conventions. One might just as well reply that since “No” is a two-letter word and “Yes” is a three-letter word, rejection takes fewer bits to encode than confirmation.
I expect, though I haven’t verified, that the earliest, simplest compilers generated more processor instructions to compute “x != 5” than they did to compute “x == 5”.
If we have a computer that evaluates statements and returns 1 for true and 0 for false—we can just as well imagine that it returns 0 for true and 1 for false and calculates the negation of those statements. In fact you wouldn’t be able to KNOW whether the computer calculates the statements or their negation, which means when you’re inputting a statement, it’s the same as inputting its negation.
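A sketch of the two hypothetical machines (using Python’s eval purely for illustration):

```python
def machine_a(stmt):
    """Evaluates the statement; 1 means true, 0 means false."""
    return 1 if eval(stmt) else 0

def machine_b(stmt):
    """Evaluates the NEGATION, but with 0 meaning true and 1 meaning false."""
    negation_holds = not eval(stmt)
    return 0 if negation_holds else 1

# No input distinguishes the two conventions: the machines agree everywhere.
for s in ["2 + 2 == 4", "2 + 2 == 5"]:
    assert machine_a(s) == machine_b(s)
```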
I think I get it. You need n bits of evidence to evaluate a statement whose MML is n bits long. Once you know the truth value of P, you don’t need any more evidence to compute NOT(P), so MML(P) has to equal MML(NOT(P)). In the real world we tend to care about true statements more than false statements, so human formalisms make it easier to talk about truths rather than falsehoods. But for every such formalism, there is an equivalent one that makes it easier to talk about false statements.
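For what it’s worth, a quick check in CPython (a modern compiler, not the earliest ones mentioned earlier) bears out the symmetry: both comparisons compile to the same number of bytecode instructions.

```python
import dis

def instr_count(src):
    """Number of bytecode instructions CPython emits for an expression."""
    return len(list(dis.get_instructions(compile(src, "<expr>", "eval"))))

assert instr_count("x == 5") == instr_count("x != 5")
```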
I think I had confused the statement of a problem with the amount of evidence needed to evaluate it. Thanks for the correction!
A big thumbs up for you, and you’re very welcome! :-)