rsnair

How do you determine what order of approximation you need? For instance, a linear approximation will work visually for an extrapolated point that is relatively close to the original point. But to what extent do you need to go back and use a higher-order approximation?

motoole2

The approximation error depends on the function f(x). If the function is a polynomial of degree k, then a Taylor series of degree k will reproduce the function exactly (not particularly interesting though). For more general functions, there will be some amount of error associated with the Taylor series approximation. One can estimate this error by using the Taylor remainder estimation theorem; see here for a discussion. Briefly though, the error of the approximation grows at a rate that depends on the distance |x - x_0|. The degree of your Taylor series should be picked according to the specific application and the form of f(x).
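
For concreteness, the Lagrange form of the remainder bounds the error of a degree-k Taylor series by |R_k(x)| <= M |x - x_0|^(k+1) / (k+1)!, where M bounds the (k+1)-th derivative of f between x_0 and x. Below is a minimal sketch (not from the lecture; f(x) = sin(x) and x_0 = 0 are chosen purely for illustration) showing how the error shrinks as the degree goes up and grows as x moves away from x_0:

```python
import math

def taylor_sin(x, x0, degree):
    """Taylor polynomial of sin about x0, truncated at the given degree."""
    # Derivatives of sin at x0 cycle through sin, cos, -sin, -cos.
    derivs = [math.sin, math.cos, lambda t: -math.sin(t), lambda t: -math.cos(t)]
    total = 0.0
    for k in range(degree + 1):
        total += derivs[k % 4](x0) * (x - x0) ** k / math.factorial(k)
    return total

x0 = 0.0
for x in (0.1, 0.5, 1.0, 2.0):       # increasing distance from x0
    for degree in (1, 3, 5):
        err = abs(math.sin(x) - taylor_sin(x, x0, degree))
        print(f"x = {x:.1f}, degree {degree}: error = {err:.2e}")
```

Running this, the error for a fixed degree climbs quickly as |x - x_0| increases, which is why the degree you need ultimately depends on how far you extrapolate and how quickly the higher derivatives of f grow.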