I don’t think that’s correct. My best understanding of Gödel’s theorem is that if your system of logic is powerful enough to express itself, then you can create a statement like “this sentence is unprovable”. That’s pretty short and doesn’t rely on infiniteness.
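For concreteness, the usual formal shape of that sentence, obtained via the diagonal lemma, looks like the following; the exact notation varies by textbook, but the structure is standard.

```latex
% Gödel sentence G for a sufficiently strong, computably axiomatized theory T:
% the diagonal lemma gives a G that T proves equivalent to "G is not provable in T".
T \vdash \; G \;\leftrightarrow\; \neg\,\mathrm{Prov}_T(\ulcorner G \urcorner)
% If T is consistent, T does not prove G; if T is omega-consistent
% (or, using Rosser's modified sentence, merely consistent), T does not prove \neg G either.
```

The $\ulcorner G \urcorner$ part is the Gödel number of $G$, which is where the “tricks to encode” mentioned in the reply below come in.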
The statement “this sentence is unprovable” necessarily includes all the information on how to prove things, so it’s always larger than your logical system. It’s usually much larger, because “this sentence” requires some tricks to encode.

To see this another way, the halting problem can be seen as equivalent to Gödel’s theorem. But it’s trivially possible to have a program of length X+C that solves the halting problem for all programs of length X, where C is a rather low constant; see https://en.wikipedia.org/wiki/Chaitin%27s_constant#Relationship_to_the_halting_problem for how.
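To make the X+C claim concrete, here is a toy sketch of the simpler “counting” variant of that idea (not the Chaitin’s-constant construction from the link): if someone hands you the number of programs in a finite set that eventually halt, that single number is enough advice to decide halting for every program in the set by running them all in parallel. The generator-based “programs” and the function names below are made up for illustration; the real statement is about Turing machines.

```python
# Toy sketch of the counting trick behind the X+C claim: if you are told how many
# programs in a finite set eventually halt (that one number is the non-computable
# "advice"), you can decide halting for every program in the set by dovetailing.
# Programs here are zero-argument functions returning generators, standing in for
# real machines; each next() call is one computation step.

def decide_halting(programs, num_that_halt):
    """Return a dict mapping program index -> True (halts) / False (loops).

    Assumes `num_that_halt` is the correct count of programs that eventually halt;
    an overstated count would make this loop forever, which is exactly why the
    count itself cannot be computed.
    """
    running = {i: p() for i, p in enumerate(programs)}  # index -> running generator
    verdict = {}
    while len(verdict) < num_that_halt:        # dovetail: one step each, round robin
        for i in list(running):
            try:
                next(running[i])               # advance program i by one step
            except StopIteration:              # program i halted
                verdict[i] = True
                del running[i]
    # Once the promised number have halted, the survivors can never halt.
    for i in running:
        verdict[i] = False
    return verdict


# Example: two programs that halt, one that loops forever.
def halts_after(n):
    def prog():
        for _ in range(n):
            yield
    return prog

def loops_forever():
    while True:
        yield

print(decide_halting([halts_after(3), loops_forever, halts_after(10)], num_that_halt=2))
# {0: True, 2: True, 1: False}
```

Writing down that count for all programs of length X takes about X bits, and the dovetailing code itself has a fixed size, which is roughly where a total length of X + C comes from.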
I’m not sure how much space it would take to write down formally, and I’m not sure it matters. At worst it’s a few pages, but not entire books, let alone some exponentially huge thing you’d never encounter in reality.
These are also not totally arbitrary axioms that would never be encountered in reality. There are reasons why someone might want to define the rules of logic within logic, and at that point 99% of the hard work is done.
But regardless, the interesting thing is that such an unprovable sentence exists at all: that it’s not possible to prove all true statements with any one system of logic. It’s possible that the problem is limited to this single edge case, but for all I know these unprovable sentences could be everywhere. Or worse, it could turn out that they are provable after all, and therefore that it’s possible to prove false statements.
I think the halting problem is related, but I don’t see how it’s exactly equivalent. In any case the halting-problem workaround is totally impractical, since it would take multiple ages of the universe to prove whether even a simple loop halts. That is, if you’re referring to the limited-memory version; otherwise I’m extremely skeptical.
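For reference, the limited-memory version is decidable for a simple reason: a machine with a bounded amount of memory has only finitely many configurations, so a long enough run must either halt or repeat a configuration, and a repeat means it loops forever. A minimal sketch of that argument, with a made-up `step` function standing in for a real machine:

```python
# Minimal sketch of why halting is decidable for a machine with bounded memory:
# there are only finitely many (program counter, memory) configurations, so a run
# either halts or revisits a configuration, and a revisit means it loops forever.
# `step` and the configuration encoding are hypothetical stand-ins for a real machine.

def halts_with_bounded_memory(step, initial_config):
    """step(config) returns the next configuration, or None when the machine halts.

    Configurations must be hashable and drawn from a finite set
    (that is the bounded-memory assumption).
    """
    seen = set()
    config = initial_config
    while config is not None:
        if config in seen:        # revisited a configuration: it will loop forever
            return False
        seen.add(config)
        config = step(config)
    return True                   # step returned None: the machine halted


# Example: a counter over 3 bits of memory that halts iff it ever reaches 7.
def make_step(increment):
    def step(config):
        if config == 7:
            return None                       # halt state
        return (config + increment) % 8       # 3 bits of memory: values 0..7
    return step

print(halts_with_bounded_memory(make_step(1), 0))   # True  (0, 1, 2, ..., 7)
print(halts_with_bounded_memory(make_step(2), 0))   # False (0, 2, 4, 6, 0, ... never hits 7)
```

The catch, as noted above, is that the number of configurations grows exponentially with the memory size, so decidable is a long way from practical.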
> At worst it’s a few pages, but not entire books, let alone some exponentially huge thing you’d never encounter in reality.
That’s only if your logical system is simple. If you’re a human, then the system you’re using is probably not a real logical system, and is anyway going to be rather large.
> I think the halting problem is related, but I don’t see how it’s exactly equivalent.
See http://www.solipsistslog.com/halting-consequences-godel/