Perhaps I am not phrasing my question very well. I am not asking about the existence of universal truths, but about the human intuition that such truths exist. When someone says “2+2=4”, it feels as though they are asserting a necessary truth, something that cannot possibly be otherwise. See, for example, Sniffnoy’s comment above, where he asserts that even if fingers and balls and whatnot were counted by integers mod 3, the plain unmodded integers would still exist. This seems to me like an assertion that unmodded arithmetic is a universal truth that cannot be contradicted by any experiment. My question is: ought the thought experiment of a universe whose galaxies and stars are counted by arithmetic mod 3^^^^3 cause us to abandon this intuition?
This is an illusion. If I say “37460225182244100253734521345623457115604427833 + 52328763514530238412154321543225430143254061105 = 8978898869677433866588884288884888725858488938” it should not immediately strike you as though I’m asserting a necessary truth that cannot possibly be otherwise.
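One nice feature of this example is that the claim, while not self-evident, is mechanically checkable; with arbitrary-precision integers the check is a one-liner, and a digit count alone already settles it (the numbers below are copied from the comment above):

```python
a = 37460225182244100253734521345623457115604427833
b = 52328763514530238412154321543225430143254061105
claimed = 8978898869677433866588884288884888725858488938

# Two 47-digit numbers cannot sum to a 46-digit number.
print(len(str(a)), len(str(b)), len(str(claimed)))  # 47 47 46
print(a + b == claimed)  # False
```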
Counting is an algorithm, or really a sketch of an algorithm. To make this a coherent question, i.e. to imagine running that algorithm on that many galaxies and stars, coming up with a definite answer, and then thinking through the consequences, we would need at least:
An airtight definition of “galaxies and stars”
A ledger big enough to fit 3^^^3 tick marks
A method of writing down tick marks when we see stars reliable enough that, if we counted twice and got two different answers, it would not be overwhelmingly likely that we had simply made a mistake somewhere.
Each of these is preposterous!
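The algorithm-sketch above can be made concrete. Here is a minimal tally-counting sketch (the function name and interface are hypothetical, chosen only for illustration), with an optional modulus to represent the hypothetical universe in which tallies wrap around:

```python
def count(objects, modulus=None):
    """Count by tallying: the bare-bones counting algorithm.

    If `modulus` is given, the tally wraps around, as it would in
    a hypothetical universe whose collections are counted mod n.
    """
    tally = 0
    for _ in objects:
        tally += 1
        if modulus is not None:
            tally %= modulus
    return tally

stars = ["star"] * 7
print(count(stars))             # ordinary counting: 7
print(count(stars, modulus=3))  # counting mod 3: 1
```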
It immediately strikes me that what you’re asserting is either necessarily true or necessarily false, and whichever it is it could not be otherwise.
That’s fine, but it’s not at all the same thing.
Why is the difference relevant? I honestly can’t imagine how someone could be in the position of ‘feeling as though 2+2=4 is either necessarily true or necessarily false’ but not ‘feeling as though it’s necessarily true’.
(FWIW I didn’t downvote you.)
That seems to imply you think it would feel different than how you felt at first looking at my sum. Why, besides the fact that it’s much simpler?
I sort of agree, in the sense that “2+2=4” is a huge cliché and I have a hard time imagining how someone could not have memorized it in grade school, but that’s part of the reason why I regard the “self-evidence” of this kind of claim as an illusion. We take shortcuts on simple questions.
I believe that “2+2=4 is either necessarily true or necessarily false”. I believe 2+2=4 is necessarily true (modulo definitions). I don’t believe it’s necessarily true that “2+2=4 is necessarily true”.
There’s some pretty strong evidence that the proof that 2+2=4 doesn’t have a mistake in it (heckuva lot of eyeballs). I have good reasons (well, reasons anyway) to believe that mathematical truths are necessary. Thus most of my probability mass is on “2+2=4 is necessarily true”. Yet, even if it’s necessarily true that “2+2=4 is either necessarily true or necessarily false”, and 2+2=4 is true, it still needn’t be necessarily true that “2+2=4 is necessarily true”, even though 2+2=4 is necessarily true.
If your eyes have glazed over at this point, I’ll just say that Provable(X) doesn’t imply Provable(Provable(X)), and if you think it does, it’s because your ontology of mathematics is wrong and Gödel will eat you.
That’s exceptionally unlikely for more reasons than one might think.
Not sure what work “necessarily” is doing, but mostly I’m with you. Still, I think this is mistaken: “Provable(X) doesn’t imply Provable(Provable(X))”. In fact it does; that is one of the standard derivability conditions for a provability predicate.
Though it is true and important that Unprovable(X) does not imply Provable(Unprovable(X)).
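For reference, the correction can be made precise with the standard Hilbert–Bernays–Löb derivability conditions for a provability predicate Prov in a theory T such as Peano arithmetic (notation sketched in the usual way):

```latex
\begin{align*}
&\text{(D1)}\quad \text{if } T \vdash X \text{, then } T \vdash \mathrm{Prov}(\ulcorner X\urcorner)\\
&\text{(D2)}\quad T \vdash \mathrm{Prov}(\ulcorner X \to Y\urcorner) \to
  \bigl(\mathrm{Prov}(\ulcorner X\urcorner) \to \mathrm{Prov}(\ulcorner Y\urcorner)\bigr)\\
&\text{(D3)}\quad T \vdash \mathrm{Prov}(\ulcorner X\urcorner) \to
  \mathrm{Prov}(\ulcorner \mathrm{Prov}(\ulcorner X\urcorner)\urcorner)
\end{align*}
```

(D3) is exactly “Provable(X) implies Provable(Provable(X))”. The direction that genuinely fails is the negative one: by Gödel’s second incompleteness theorem, a consistent T cannot in general prove Unprovable(X) even when X is in fact unprovable (take X to be a contradiction, so that Unprovable(X) is the consistency statement).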
Observing that the universe functions by modular arithmetic would not contradict integer arithmetic; it would contradict the theory that counting objects in the universe can be expressed using integer arithmetic.
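A minimal sketch of the distinction (function names are mine, for illustration only): both operations below are internally consistent, and an observation about stars could only bear on which one is the right model of star-counting, not refute either operation itself.

```python
def add_integers(a, b):
    # Ordinary integer addition: 2 + 2 = 4.
    return a + b

def add_mod3(a, b):
    # Addition in arithmetic mod 3: 2 + 2 = 1.
    return (a + b) % 3

# Neither result contradicts the other; they answer different questions.
print(add_integers(2, 2))  # 4
print(add_mod3(2, 2))      # 1
```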
Well, perhaps my question is better phrased as, “What is the referent of mathematics which does not describe objects in the universe?”
It’s a certain model that exists within your mind and your mathematics textbook. :)
Models are tested by reference to experiment. If the model only exists in my head, what is the experiment that tests it? If there is no external reference, then in what sense can anything at all be said to contradict the model?
Are models tested by reference to experiment? You can demonstrate that a model is inapplicable in some cases; if it’s inapplicable in all cases, it is useless; I don’t know if it means anything to say that the model itself is false.
This is doubly so when applied to mathematics; mathematics (specifically, logic) is the model that gives a context to the term “contradiction”; that tells what it means and how we know it applies.