The question is, what do you mean by “approximately”?
If you mean that, for any error size, the supremum of the distance between some linear approximation and the function stays below that error at all scales smaller than some given scale, then the necessary and sufficient condition is continuity. Differentiability is merely sufficient.
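One way to make that precise (a sketch in my own notation, with $a$ the point of interest and $h$ the scale): the weak requirement is

$$\forall \varepsilon > 0\ \ \exists \delta > 0\ \ \forall h \in (0,\delta)\ \ \exists\, \text{affine } L:\quad \sup_{|x-a|\le h} |f(x) - L(x)| < \varepsilon.$$

If $f$ is continuous at $a$, the constant function $L(x) \equiv f(a)$ already satisfies this. Conversely, an affine $L$ within $\varepsilon$ of $f$ on the window gives $|f(x)-f(a)| \le 2\varepsilon + |L(x)-L(a)|$, and the last term vanishes as $x \to a$, so the requirement also forces continuity at $a$.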
When the function is differentiable, you can make claims about how fast the error decreases asymptotically with scale.
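Concretely (standard Taylor-style bounds, under smoothness assumptions of my own adding): if $f$ is differentiable at $a$, the tangent line satisfies

$$\sup_{|x-a|\le h} \bigl|f(x) - f(a) - f'(a)(x-a)\bigr| = o(h) \quad \text{as } h \to 0^+,$$

and if $f$ is twice continuously differentiable near $a$, Taylor’s theorem sharpens this to $O(h^2)$, with constant $\tfrac{1}{2}\sup|f''|$ over the window.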
And if you use the ArthurB definition of “approximately” (which is an excellent definition for many purposes), then a piecewise constant function would do just as well.
Indeed.
But I may have gotten “scale” wrong here. If we scale the error down at the same time as we scale the part we’re looking at, then differentiability is necessary and sufficient. If we’re only concerned with approximating the function on a smallish part, then continuity is what we’re looking for.
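Spelled out (again in notation of my own choosing), the scaled version asks for a single slope $m$ with

$$\sup_{|x-a|\le h} |f(x) - f(a) - m(x-a)| = o(h) \quad \text{as } h \to 0^+,$$

which holds if and only if $f$ is differentiable at $a$, in which case $m = f'(a)$. Dropping the scaling on the error side takes us back to the weaker condition above, where continuity is enough.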