What exactly is the aspect of natural numbers that makes them break math, as opposed to other types of values? Intuitively, it seems to be the fact that they can be arbitrarily large but not infinite.
Like, if you invent another data type that has only a finite number of values, it would not let you construct something equivalent to Gödel numbering. But if it allows an infinite number of (finite) values, it would. (I'm not sure about a type with an infinite number of values, or one including infinite values; that would probably also break math.)
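For concreteness, here is roughly what I mean by "something equivalent to Gödel numbering": a minimal sketch (names and details are mine) of the classic prime-power encoding, which maps any finite sequence of naturals to a single natural and back. The point is that this only works because the naturals are unbounded: a finite type would run out of room.

```python
def primes():
    """Yield the primes 2, 3, 5, ... by trial division."""
    found = []
    n = 2
    while True:
        if all(n % p for p in found):
            found.append(n)
            yield n
        n += 1

def godel_encode(seq):
    """Map a finite sequence of naturals to one natural:
    (a0, a1, ..., ak) -> p0^(a0+1) * p1^(a1+1) * ... * pk^(ak+1).
    The +1 in each exponent makes trailing zeros recoverable."""
    gen = primes()
    n = 1
    for a in seq:
        n *= next(gen) ** (a + 1)
    return n

def godel_decode(n):
    """Invert the encoding by reading off prime exponents."""
    seq = []
    gen = primes()
    while n > 1:
        p = next(gen)
        e = 0
        while n % p == 0:
            n //= p
            e += 1
        seq.append(e - 1)
    return seq
```

So `godel_encode([2, 0, 1])` is 2³ · 3¹ · 5² = 600, and decoding recovers the sequence. Since formulas and proofs are finite sequences of symbols, this lets statements *about* numbers be represented *as* numbers, which is the self-reference that incompleteness exploits.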
It seems like you cannot precisely define the natural numbers using first-order logic. Is that the reason for all of this? Or is it a red herring? Would the situation be any better with second-order logic?
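To pin down the contrast I have in mind (stated roughly, as I understand it): first-order Peano arithmetic can only express induction as a schema, one axiom per definable formula, which leaves room for nonstandard models; second-order logic states it as a single axiom quantifying over all subsets, which is what makes second-order arithmetic categorical.

```latex
% First-order induction: an axiom schema, one instance per formula \varphi(x)
\bigl(\varphi(0) \land \forall n\,(\varphi(n) \rightarrow \varphi(n+1))\bigr)
  \rightarrow \forall n\,\varphi(n)

% Second-order induction: a single axiom quantifying over all sets X
\forall X\,\Bigl(\bigl(0 \in X \land \forall n\,(n \in X \rightarrow n+1 \in X)\bigr)
  \rightarrow \forall n\,(n \in X)\Bigr)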
(These are the kinds of questions that I assume would be obvious to me if I grokked the situation. So the fact that they are not obvious suggests that I do not see the larger picture.)