Benoit,
1.999… can only be the same as (or equal to) 2 in some kind of imaginary world. The number 1.999…, where there is an infinity of 9s, does not “exist” insofar as it cannot be “represented” in a finite amount of space or time. The only way out is to “represent” infinity by (...). So you represent something infinite by something finite, thus avoiding a serious problem. But then stating that 1.999… is equal to 2 becomes a tautology.
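For context, the standard mathematical reading treats 1.999… as the limit of its finite truncations 1.9, 1.99, 1.999, and so on. A quick numerical sketch (illustrative only, not a proof, and limited by floating-point precision):

```python
# Finite truncations of 1.999...: each step appends one more 9.
# The gap to 2 shrinks by a factor of 10 at every step.
partial = 1.0
gap = 1.0
for n in range(1, 16):
    gap /= 10          # 0.1, 0.01, 0.001, ...
    partial = 2 - gap  # 1.9, 1.99, 1.999, ...
print(partial)  # approaches 2.0 as more 9s are appended
```

Of course, this only ever shows finitely many 9s, which is exactly the point being argued above: the finite notation stands in for an infinite process.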
Of course, mathematicians are now used to dealing with infinities. They can manipulate them any way they want. But in the end, infinity has no equivalent in the “real” world. It is a useful abstraction.
So, back to arithmetic. We can only “count” because our physical world is a quantum world. We have units because the basic elements are units, like elementary particles. If the real world were a continuum, there would be no arithmetic. Furthermore, arithmetic is a feature of the macroscopic world: when you look closer, it breaks down. In quantum physics, 1+1 is not always equal to 2. You can have many indistinguishable particles in the same quantum state. How do you count sheep when you can’t distinguish them?
I don’t see anything “obvious” in stating that 1+1=2. It’s only a convention. “1” is a symbol. “2” is another symbol. Trace it back to the “real” world, and you find that having one object plus another of the same kind (but distinct) requires subtle physical conditions.
On another note, arithmetic is a recent invention for humanity. Early humans couldn’t count much beyond 5, perhaps only 3. Our brains are not that good at counting; that’s why we learn arithmetic tables by heart and count on our fingers. We have not “evolved” as arithmeticians.