How to compute the error bounds on a Taylor approximation (in Python, using SymPy)
Task
A Taylor series approximation of degree n for a function f, centered at a, has its error at a point x_0 bounded by the Lagrange remainder, |x_0 - a|^(n+1) / (n+1)! times the maximum of |f^(n+1)(c)| for c between a and x_0.
How can we compute this error bound using mathematical software?
Solution
This answer assumes you have imported SymPy as follows.
from sympy import * # load all math functions
init_printing( use_latex='mathjax' ) # use pretty math output
Let’s create a simple example. We’ll be approximating sin(x) with a Taylor polynomial of degree n = 5 centered at a = 0, and we’ll bound the error of that approximation at the point x_0 = 1.
var( 'x' )        # the variable in the formula
formula = sin(x)  # the function to approximate
a = 0             # center of the Taylor series
x_0 = 1           # point at which to bound the error
n = 5             # degree of the Taylor polynomial
We will not ask SymPy to compute the maximum of |f^(n+1)(c)| exactly, but will instead have it sample a large number of c values between a and x_0 and take the largest sample.
# Get 1001 evenly-spaced c values between a and x_0:
cs = [ Min(x_0,a) + abs(x_0-a)*i/1000 for i in range(1001) ]
# Create the formula |f^(n+1)(x)|:
formula2 = abs( diff( formula, x, n+1 ) )
# Find the max of it over all those sampled values:
m = Max( *[ formula2.subs(x,c) for c in cs ] )
# Compute the error bound:
N( abs(x_0-a)**(n+1) / factorial(n+1) * m )
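As a cross-check, we can compute the maximum exactly rather than by sampling. The sketch below assumes SymPy's top-level maximum function (available in modern SymPy versions) and uses the fact that the sixth derivative of sin is -sin, which is nonpositive on [0, 1], so its absolute value there is just sin(x).

```python
from sympy import sin, diff, maximum, Interval, Symbol, factorial, N

x = Symbol('x')
f6 = diff(sin(x), x, 6)              # the 6th derivative of sin: -sin(x)
# On [0, 1], sin is nonnegative, so |f6| = -f6 = sin(x):
M = maximum(-f6, x, Interval(0, 1))  # exact maximum: sin(1)
bound = N(abs(1 - 0)**6 / factorial(6) * M)
print(bound)
```

This agrees with the sampled estimate above, since sin(x) is increasing on [0, 1] and the sampling includes the endpoint c = 1.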
The error is at most approximately 0.00117.
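To see that this bound holds in our example, we can compare it to the actual error of the approximation. The sketch below uses SymPy's series function to build the degree-5 Taylor polynomial and evaluates its error at x_0 = 1.

```python
from sympy import sin, series, Symbol, Abs, N

x = Symbol('x')
# Degree-5 Taylor polynomial of sin(x) about a = 0:
p = series(sin(x), x, 0, 6).removeO()   # x - x**3/6 + x**5/120
# Actual error at x_0 = 1:
actual_error = N(Abs(sin(1) - p.subs(x, 1)))
print(actual_error)
```

The actual error (about 0.0002) is well below the computed bound, as expected.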
Content last modified on 24 July 2023.
Contributed by Nathan Carter (ncarter@bentley.edu)