Cool, another one! I’m supposed to be sleeping now rather than working, so I can engage with this.
(b), “there are mysterious forces at work here”
we would have to multiply by infinity and that wouldn’t prove anything because we already know such operations are suspect.
Infinity is weird, and it makes math weird. I think a fuzzy version of this belief is pretty widespread—look what you get when you do an image search for “divide by zero”, for example. For me, and I suspect for a lot of people with very little general math knowledge, “infinity” is a stop sign. Inquiry ends, shoulders are shrugged, hands are thrown up. “Of course it doesn’t appear to make sense—it’s got infinity in it!”
I don’t remember where I got this notion, but it must have been early, because I remember seeing a version of the “disguise a division by zero ⇒ 1=2” trick in a book (Fermat’s Last Theorem by Simon Singh, if anyone’s interested) when I was about 14 and being baffled by it, going over and over it trying to find the mistake. When I gave up and read on, and saw the explanation of how one of the canceled terms in the equation was zero, I was instantly satisfied. “Oh, of course. It divides by zero, which is a sneaky way of introducing infinity into the mix—so naturally the result makes no sense.”
This is one of those situations where a little incomplete knowledge is actually worse than none—a person who hadn’t ever heard about the infinity-makes-everything-weird “rule” could see something like 0.999… = 1 and keep digging, instead of saying “yeah, that’s infinity for you, what can you do”.
The idea that infinity is some sort of magical spell that you can cast upon “real” math and turn it into a frog (using real in the everyday sense, not the math-sense) is obviously an irrational thought-stopper. It means you could present a false statement to me and I wouldn’t question it so long as infinity was there to point to as the culprit.
(If you’re able to quickly formulate an example of a superficially math-y looking proposition involving infinity that’s actually total BS, that would be awesome—I could use it in future conversations about the topic.)
By the way, I’m not talking about some version of me in the distant past—I realized that I use “infinity makes everything weird” as a thought-terminating cliche five minutes ago. I didn’t realize I was exempting mathematics from the same sort of bias-questioning rationality I try to apply to everything else until you pointed it out.
So, thanks for that—I still may not understand why 0.999… = 1, or how dividing by zero leads to results like 1=2, but at least from now on I won’t let a non-answer like “infinity did it!” kill my curiosity.
Infinities are okay if they come with a definition of convergence. For example, we can say that an infinite sequence of real numbers x1, x2, x3… “converges” to a real number y if every interval of the real line centered around y, no matter how small, contains all but finitely many elements of the sequence. For example, the sequence 1, 1/2, 1/3, 1/4… converges to 0, because every interval centered around 0 contains all but finitely many of 1, 1/2, 1/3, 1/4… Some sequences don’t converge to anything, like 0, 1, 0, 1…, but it’s an easy exercise to prove that no sequence can converge to two different values at once.
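To make the definition concrete, here’s a rough numerical sketch in Python (my own illustration, nothing from the original exchange): pick a candidate limit y and an interval half-width eps, and count how many terms land outside the interval. For 1, 1/2, 1/3, … around 0 that count stays at roughly 1/eps no matter how many terms you take; for 0, 1, 0, 1, … no candidate y has that property.

```python
# Rough numerical illustration of the convergence definition above
# (a finite check like this is not a proof, just a way to see the pattern).

def count_outside(terms, y, eps):
    """How many of the given terms fall outside the interval (y - eps, y + eps)?"""
    return sum(1 for x in terms if abs(x - y) >= eps)

N = 100_000
harmonic = [1 / n for n in range(1, N + 1)]   # 1, 1/2, 1/3, ...
alternating = [n % 2 for n in range(N)]       # 0, 1, 0, 1, ...

# Around y = 0, only about 1/eps terms of 1/n lie outside the interval,
# no matter how large N is; "all but finitely many" are inside.
for eps in (0.1, 0.01, 0.001):
    print(f"1/n, eps={eps}: {count_outside(harmonic, 0, eps)} terms outside")

# For 0, 1, 0, 1, ... no y works: around y = 0 (or y = 1), half of the terms
# are always at distance 1, so the count keeps growing with N instead of staying fixed.
print(f"0,1,0,1,... around 0, eps=0.5: {count_outside(alternating, 0, 0.5)} terms outside")
```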
Now the only sensible way to understand 0.999… is to define it as whatever value 0.9, 0.99, 0.999… converges to. But that’s obviously 1: the nth term falls short of 1 by exactly 1/10^n, so every interval centered around 1, however small, contains all but finitely many of the terms. And that’s the end of the story for people who understand math.
You can use the same procedure for infinite sums. x1+x2+x3+… can be defined as whatever value x1, x1+x2, x1+x2+x3… converges to. For example, 1+1/2+1/4+1/8+… = 2, because the sequence of partial sums is 2-1, 2-1/2, 2-1/4, 2-1/8, … and converges to 2.
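The same idea as a quick Python sketch (again just an illustration of the definition, not a proof): compute the partial sums and watch where they head. Note that 0.9, 0.99, 0.999, … are exactly the partial sums of 9/10 + 9/100 + 9/1000 + …, so this covers the 0.999… case too.

```python
# Partial sums of two series, to see numerically what they converge to.

def partial_sums(terms):
    """Running totals: terms[0], terms[0]+terms[1], terms[0]+terms[1]+terms[2], ..."""
    total, out = 0.0, []
    for t in terms:
        total += t
        out.append(total)
    return out

# 1 + 1/2 + 1/4 + 1/8 + ... : partial sums 1, 1.5, 1.75, 1.875, ... heading to 2.
geometric = [1 / 2**k for k in range(20)]
print(partial_sums(geometric)[-5:])   # last few values, all within a few millionths of 2

# 9/10 + 9/100 + 9/1000 + ... : partial sums 0.9, 0.99, 0.999, ... heading to 1,
# which is exactly the sequence that defines 0.999...
nines = [9 / 10**k for k in range(1, 16)]
print(partial_sums(nines)[-5:])       # very close to 1 (up to float rounding)
```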
By now it should be clear that 1+2+3+4+… doesn’t converge to anything under our definition. But our definition isn’t the only one possible. You can make another self-consistent definition of convergence, where 1+2+3+4+… will indeed converge to −1/12. But that definition is complex, esoteric and much less useful than the regular one, which is why that viral video really shouldn’t have used it without remark.
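For contrast, a minimal sketch of what happens to 1+2+3+4+… under the ordinary definition: its partial sums just grow without bound, so no interval around any candidate value ever traps the tail.

```python
# Partial sums of 1 + 2 + 3 + 4 + ... : they are n(n+1)/2 and grow without bound.
total = 0
for n in range(1, 11):
    total += n
    print(n, total)   # 1 1, 2 3, 3 6, ..., 10 55

# No real number y has "all but finitely many partial sums" close to it, so the
# series has no value under the ordinary definition above. The famous -1/12 comes
# from a different, more esoteric way of assigning values to divergent series
# (usually explained via the Riemann zeta function), not from this limit.
```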
Most paradoxes involving infinity are just pulling a fast one on you by not specifying what they mean by convergence. If you try to use the common sense definition above, or really any self-consistent way to assign values to infinite expressions, the paradoxes usually go away.
Here’s how dividing by zero leads to results like 1=2:
You may have heard that functions must be well-defined, which means x=y ⇒ f(x)=f(y). This property is what allows you to apply any function to both sides of an equation and preserve its truth. If the function is also one-to-one (i.e. x=y ⇔ f(x)=f(y)), truth is preserved in both directions, and you can un-apply a function from both sides of an equation as well. Multiplication by a factor c is one-to-one iff c isn’t 0. Therefore, un-applying multiplication by 0 (that is, cancelling a factor that happens to equal 0) is not in general truth-preserving.
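To connect the two halves, here is one common version of the trick written out (not necessarily the exact one from Singh’s book), with the illegal step marked: the cancellation in step (5) is precisely an un-application of multiplication by 0.

```latex
% One common version of the "1 = 2" trick. Every step is legal except (5),
% which cancels the factor (a - b); since a = b, that factor is 0, so step (5)
% is exactly the non-truth-preserving "un-apply multiplication by 0" move.
\begin{align*}
a &= b                     && \text{(1) start with any } a = b \ne 0 \\
a^2 &= ab                  && \text{(2) multiply both sides by } a \\
a^2 - b^2 &= ab - b^2      && \text{(3) subtract } b^2 \text{ from both sides} \\
(a + b)(a - b) &= b(a - b) && \text{(4) factor both sides} \\
a + b &= b                 && \text{(5) cancel } (a - b) \text{, but } a - b = 0 \\
2b &= b                    && \text{(6) substitute } a = b \\
2 &= 1                     && \text{(7) divide both sides by } b \ne 0
\end{align*}
```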