Technically the Peano axioms don’t define addition and multiplication, they only provide a framework for analyzing them.
I know. What I wrote doesn’t say that they define addition and multiplication, only that they allow us to: “allowing us to define addition and multiplication”.
In your upper-bounded monotonic sequences proof you are using M to mean two different things. Also, the existence of M in the second sentence doesn’t follow, because xₘ could be less than L for all m.
Good catch! I stayed up pretty late writing the review, and I suppose I should watch out for that next time. However, you can still repair the proof by setting the upper bound to L−ϵ and repeating the argument up to N times, since the sequence can only escape upwards (or have its upper bound refined downwards) by ϵ a finite number of times.
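One way to make the objection concrete (this assumes L here stands for an arbitrary upper bound rather than the least one, which is only my reading of the proof): take the increasing sequence
$$x_n = 1 - \frac{1}{n},$$
which is bounded above by L = 2. Then xₙ < 1 < L − ϵ for every n and every ϵ < 1, so there is no index M at which the sequence comes ϵ-close to L.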
It looks like there is a step missing in the argument somewhere. Also, I don’t see where you have used the completeness axiom, and you certainly can’t prove the result without using completeness.
And why is that? The proof I provided (with the typo fixed and the step filled in) seems sufficient?
There is a much easier example of a function that is uniformly continuous but not Lipschitz continuous.
I found it less interesting.
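Presumably the easier example meant here is something along the lines of f(x) = √x on [0,1]: it is uniformly continuous, being continuous on a compact interval, but it is not Lipschitz, since
$$\frac{|\sqrt{x} - \sqrt{0}|}{|x - 0|} = \frac{1}{\sqrt{x}} \to \infty \quad \text{as } x \to 0^{+},$$
so no single Lipschitz constant can work near 0.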
The assumption that f and g are differentiable on [a,b] is too strong for L’Hôpital’s rule.
I’d like to say that I literally took the definition from the book here, but that’s good to know!
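As a sketch of why the open-interval hypothesis is the right one (assuming the usual one-sided form of the rule, not any particular book’s statement): consider
$$\lim_{x \to 0^{+}} \frac{\sin\sqrt{x}}{\sqrt{x}}.$$
Neither sin√x nor √x is differentiable at 0, so a hypothesis of differentiability on [0,b] would exclude this limit; but both are differentiable on (0,b) with nonvanishing denominator derivative, and
$$\frac{(\sin\sqrt{x})'}{(\sqrt{x})'} = \frac{\cos\sqrt{x}\cdot\frac{1}{2\sqrt{x}}}{\frac{1}{2\sqrt{x}}} = \cos\sqrt{x} \to 1 \quad \text{as } x \to 0^{+},$$
so the rule still applies and gives the correct value 1 (which one can confirm directly by substituting u = √x).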
I am not sure what you are trying to say about $\lim_{x\to 0}\frac{\ln(1-x)-\sin(x)}{1-\cos^2(x)}$.
It’s a pop culture reference, related to the GIF I had just used. A little bit of trivia.
The Peano axioms aren’t what allows you to define addition and multiplication, though. For example, the induction axiom bears no relation to the definitions of addition and multiplication. (Some of the Peano axioms follow directly from the definitions of addition and multiplication, but really it is the axioms that follow from the definitions and not the other way around.)
You now appear to be proving that for all ϵ, there exists L such that the sequence is eventually ϵ-close to L. Whereas to prove that the sequence converges, you would need to show that there exists an L such that for all ϵ, the sequence is eventually ϵ-close to L.
Regarding $\lim_{x\to 0}\frac{\ln(1-x)-\sin(x)}{1-\cos^2(x)}$, perhaps I am still not getting the joke, but what I meant was that I don’t think the rule “if f and g are not differentiable on (a,b], then perhaps the limit does not exist” is the rule that you would use to determine that the limit does not exist in the example from the movie, since f and g are in fact differentiable on (a,b] in that example.
The Peano axioms aren’t what allows you to define addition and multiplication, though...
I don’t really view what I wrote as saying “you can’t formalize addition and multiplication without (all of) these specific axioms”. All I was saying is that these axioms provide a framework by which addition and multiplication can be coherently defined, and with which we can prove their properties. Sure, the induction axiom is not related, but I don’t see how you can make the same case for the axioms defining the successor function for the natural numbers. That is, in this book, we actually used these axioms to define addition and multiplication. Maybe I’m still missing your point!
You now appear to be proving that for all ϵ, there exists L such that the sequence is eventually ϵ-close to L...
But I fixed L before doing anything with ϵ?
Regarding lim…
I see how that’s confusing! I was saying that L’Hôpital’s rule is what you would use—not the sentence you had in mind. I’ll clear that up.
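For reference, the standard computation for that limit (not taken from the review, just the usual Taylor-expansion argument): writing 1 − cos²(x) = sin²(x),
$$\frac{\ln(1-x)-\sin(x)}{1-\cos^{2}(x)} = \frac{\left(-x - \frac{x^{2}}{2} - \cdots\right) - \left(x - \frac{x^{3}}{6} + \cdots\right)}{x^{2} - \frac{x^{4}}{3} + \cdots} = \frac{-2x + O(x^{2})}{x^{2} + O(x^{4})} \sim -\frac{2}{x},$$
which tends to −∞ as x → 0⁺ and to +∞ as x → 0⁻, so the two-sided limit does not exist. Note that the numerator and denominator are perfectly differentiable near 0; the limit fails to exist simply because the quotient blows up with opposite signs on the two sides.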
Regarding Peano arithmetic, my point is that S(a)+b := S(a+b) (for example) is not the same thing as S(a)+b = S(a+b). The former is a definition, whereas the latter is a statement (which can be used as an axiom). In order to make sense of the definition you need to understand the concept of recursion, whereas you can make sense of the statement just by thinking of addition as “some binary operation that we don’t know what it is, but this is a fact we know about it”. (Let me reiterate that this is a rather technical distinction; I don’t think it makes too much difference either way, but I thought it was worth pointing out.)
So, your proof is actually kind of confusingly worded and I am not sure exactly what you mean by “repeating the above argument N times”. It looks like you want to keep the same ϵ but use a different L each time. But you can’t do that if you fix L before introducing ϵ.
Another way to look at it is to consider the same proof but with all variables assumed to be rational. In that case the conclusion of the theorem would be false (because there are bounded monotonic sequences of rationals that don’t converge to any rational) and so you should try to figure out which step of your proof would be wrong in the new context. Ordinarily, it should be where you use the completeness axiom, because the completeness axiom doesn’t hold for the rationals.
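A concrete illustration of the rational-variables point, and of the earlier point about the order of quantifiers (this example is not from the original proof): let xₙ be √2 truncated to n decimal places, an increasing sequence of rationals bounded above by 1.5. For every ϵ > 0 there is a rational L (for instance a sufficiently late truncation) such that the sequence is eventually ϵ-close to L, so the “for all ϵ there exists L” statement survives inside ℚ; but no single rational L works for every ϵ, because the only candidate is √2 itself, which is irrational. The gap between the two quantifier orders,
$$(\forall \epsilon > 0)(\exists L)(\exists N)(\forall n \ge N)\ |x_n - L| \le \epsilon \quad \text{versus} \quad (\exists L)(\forall \epsilon > 0)(\exists N)(\forall n \ge N)\ |x_n - L| \le \epsilon,$$
is exactly what the completeness axiom closes.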
Regarding Peano arithmetic, my point is that S(a)+b := S(a+b) (for example) is not the same thing as S(a)+b = S(a+b)...
Especially in math, it’s great to have concepts or diction which I think “clearly have the same shade of connotation” (but actually don’t) brought to my attention. I understand the difference you’re pointing to here, and I’ll be more cognizant of it in the future. Precision of thought and speech is important, so thanks!
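For concreteness, here are the full recursive definitions, in the notation of the S(a)+b example above (so not necessarily the exact notation of the book):
$$0 + b := b, \qquad S(a) + b := S(a + b); \qquad 0 \cdot b := 0, \qquad S(a) \cdot b := a \cdot b + b.$$
Read with “:=”, these clauses define addition and multiplication by recursion on the first argument, and the recursion theorem is what guarantees that unique functions satisfying them exist; read with “=”, the very same clauses are axioms about two binary operations that are otherwise left unspecified.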
So, your proof is actually kind of confusingly worded...
That’s a good point, and one I realized soon after leaving my computer: L doesn’t stay fixed in my argument. I think that one should still be able to show that all these different L’s have to be the same, right? That is (this is just an outline), take any distinct ϵ₁, ϵ₂ > 0; then suppose that for distinct L₁, L₂ ∈ ℝ, the sequence is eventually ϵ₁-close to L₁ and eventually ϵ₂-close to L₂. Then we have that |L₁ − L₂| ≤ ϵ₁ + ϵ₂. Since ϵ₁, ϵ₂ are arbitrary, we can expand this argument to an arbitrary number of arbitrarily small ϵ’s to show that all the L’s must be equal.
My intuition is just so strong on this one, and I want to see whether I can extend my proof to cover it, or whether I actually need to just change approaches.
So, the way that I understand your argument, L ranges over the values L,L+ϵ,L+2ϵ,…,L+Nϵ. (Of course there is a circular dependency here but as far as I can tell this was what you were thinking.) Of course, all of these possible values of L are clearly distinct. I’m not really sure what your point is about what happens if you assume that a sequence is eventually close to two different numbers. I think part of the issue is that your proof is a proof by contradiction, but in a proof by contradiction you are only allowed to introduce the contradiction hypothesis once, not an arbitrary number of times.
The fundamental question that any proof of this theorem should answer is: given a bounded monotonic sequence, how do we find the L that it converges to? Your proof doesn’t appear to address this question at all, which makes me think you need a completely different idea.
Note: to address your “strong intuition”, perhaps I should say that of course if you have two real numbers L₁ and L₂ such that for all ϵ₁, ϵ₂ > 0, there is a number that is ϵ₁-close to L₁ and ϵ₂-close to L₂, then L₁ = L₂. But I don’t think this fact has the significance you want it to have in the broader argument.
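To make the “how do we find L” question concrete, here is a sketch of the standard argument (recorded for reference, not as a claim about what either proof said): given an increasing sequence (xₙ) bounded above, the completeness axiom supplies
$$L := \sup\{x_n : n \in \mathbb{N}\}.$$
Now fix any ϵ > 0. Since L − ϵ is not an upper bound, some term of the sequence exceeds L − ϵ; by monotonicity every later term xₙ satisfies L − ϵ < xₙ ≤ L < L + ϵ, so the sequence is eventually ϵ-close to L. Here L is chosen once, before ϵ is ever mentioned, and the only nontrivial existence claim is the one provided by completeness, which is exactly the step that fails over the rationals.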