ABSTRACT

The basic idea of differential calculus (as perceived by modern mathematics) is to approximate a given function, at a point, by an affine function, that is, a first-degree polynomial.

Let J be an interval and c ∈ J. Let f : J → R be given. We wish to approximate f(x) for x near c by a polynomial of the form a + b(x − c). To keep the notation simple, let us assume c = 0. What is meant by approximation? If E(x) := f(x) − a − bx is the error incurred in taking the value of f(x) to be a + bx near 0, what we want is that this error go to zero much faster than x goes to zero. As

we have seen earlier, this means that lim_{x→0} (f(x) − a − bx)/x = 0.
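As a numerical sketch of this condition, one can check that for a concrete choice of f (here f(x) = sin x, with the hypothetical values a = f(0) = 0 and b = 1) the ratio E(x)/x does indeed shrink as x → 0:

```python
import math

def error_ratio(f, a, b, x):
    """Compute E(x)/x, where E(x) = f(x) - a - b*x is the approximation error."""
    return (f(x) - a - b * x) / x

# Illustrative choice (not from the text): f(x) = sin(x) near c = 0,
# approximated by a + b*x with a = 0 and b = 1.
for x in [0.1, 0.01, 0.001]:
    print(f"x = {x:>6}: E(x)/x = {error_ratio(math.sin, 0.0, 1.0, x):.2e}")
```

The printed ratios decrease toward 0, consistent with the limit above; for a poorly chosen b the ratio would instead tend to a nonzero constant.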