I don’t fully understand why this doesn’t work for some functions which are infinitely differentiable (like log x), but apparently this becomes clearer after some complex analysis.
Because the derivatives never become zero? (x^2 is infinitely differentiable, but its derivatives quickly hit zero: x^2, 2x, 2, 0, 0, 0, ...)
The question is ill-founded. You can in fact recover all of the information about log x from its Taylor series. I think TurnTrout may be confused because the Taylor series only converges on a certain interval, not globally? I’ll answer the question assuming that’s the confusion.
If you know all the derivatives of log x at x=1, but you know nothing else about log x, then you can find a Taylor series that converges on (0,2). But, given the Taylor series, you now also know all the derivatives at x=1.9. Writing a Taylor series centered at 1.9, you get a series that converges on (0,3.8). Continuing in this fashion, you can find all values of log x, for all positive real inputs, using only the derivatives at x=1. You just need multiple “steps.”
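The stepping procedure above can be sketched numerically. This is a minimal illustration, not anything from the original comment: it uses the closed form for the derivatives of log at a point a, namely f^(k)(a) = (−1)^(k+1) (k−1)!/a^k, so the k-th Taylor coefficient is (−1)^(k+1)/(k a^k), and re-centers the series at each intermediate point.

```python
import math

def log_via_taylor_steps(target, centers, n_terms=400):
    """Approximate log(target) starting only from log(1) = 0, by summing
    Taylor series re-centered at each point in `centers` in turn.

    The series centered at a converges for |x - a| < a, so reaching a
    point outside (0, 2a) requires intermediate steps."""
    value = 0.0  # log(1) = 0, our starting knowledge
    a = 1.0
    for x in list(centers) + [target]:
        t = (x - a) / a
        # Taylor series of log centered at a, evaluated at x:
        # sum_{k>=1} (-1)^(k+1) * ((x-a)/a)^k / k
        value += sum((-1) ** (k + 1) * t ** k / k
                     for k in range(1, n_terms + 1))
        a = x
    return value
```

For example, log(3) is unreachable by a single series centered at 1 (since 3 lies outside (0, 2)), but `log_via_taylor_steps(3.0, centers=[1.9])` first reaches 1.9, re-centers there, and then converges at 3.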
That said, there is a fundamental limitation. Consider the functions f(x) = 1/x and g(x) = {1/x if x > 0, 1 + 1/x if x < 0}. For x > 0, f(x) = g(x), but for x < 0 they are not equal. Clearly both functions are infinitely differentiable, but just because you know all the derivatives of f at x = 1, doesn’t mean you can determine its value at x = −1.
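A quick sanity check of this pair (my own illustration, not from the comment): f and g coincide everywhere on x > 0, so every derivative at x = 1 agrees, yet they disagree at x = −1.

```python
def f(x):
    # f(x) = 1/x on its whole domain
    return 1.0 / x

def g(x):
    # g agrees with f for x > 0, but is shifted by 1 for x < 0
    return 1.0 / x if x > 0 else 1.0 + 1.0 / x

# f == g on all of (0, infinity), so all derivatives at x = 1 match;
# but f(-1) = -1 while g(-1) = 0.
```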
Okay, so Taylor series allow you to probe all values of a function, but it might take multiple steps, and singularities cause real unfixable problems. The correct way to think about this is that functions aren’t just differentiable or not, they are infinitely differentiable *on a set*. For example, 1/x is smooth on (-infinity,0) union (0,infinity), which is a set with two connected components. The Taylor series allows you to probe all of the values on any individual connected component, but it very obviously can’t tell you anything about other connected components.
As for why it sometimes takes multiple “steps,” like for log x: a Taylor series always converges on an interval symmetric about its center (the radius of convergence is the same in both directions). For log x centered at x = 1, it simply can’t converge at 3 without also converging at −1, which is impossible since −1 is outside the connected component where log x is differentiable. The Taylor series converges on the largest interval where it can possibly converge, but it still tells you the values elsewhere (in the connected component) if you’re willing to work slightly harder.
Everything I said is true for analytic functions. There is still the issue of infinitely differentiable non-analytic functions as described here. Log x is not an example of such a function; log x is analytic. These counterexamples are much more subtle, but it has to do with the fact that matching the first n derivatives only pins a function down up to an error of size O(x^n) near the center, so even the full Taylor series can miss terms like e^(−1/x^2), which decay faster than any polynomial.
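The standard counterexample alluded to above can be written down directly. This is a sketch of the classic bump-type function, not something from the comment itself: every derivative of exp(−1/x^2) at 0 equals 0 (a known fact), so its Taylor series at 0 is the zero function, yet the function is nonzero everywhere else.

```python
import math

def bump(x):
    """exp(-1/x^2), extended by 0 at x = 0: infinitely differentiable
    everywhere, but every derivative at 0 is 0, so its Taylor series
    at 0 is identically zero and recovers nothing about the function."""
    return math.exp(-1.0 / x**2) if x != 0 else 0.0
```

Every Taylor polynomial at 0 predicts bump(x) = 0, but bump(0.5) = e^(−4) ≈ 0.018. Near 0 the function vanishes faster than any power of x, which is exactly why the polynomial error bounds never detect it.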
Thank you for this, that’s very helpful.
Counterexample: sin x is analytic, but its derivatives don’t satisfy your proposed condition for being analytic (they cycle forever without ever reaching zero: sin x, cos x, −sin x, −cos x, ...).